
Forecasting the Earth’s Temperature
by David Whitehouse via Benny Peiser’s CCnet
The recent spate of scientific papers attempting to predict what the earth’s temperature might be in the coming decades, and to explain the current global temperature standstill, is very interesting both for the methods used to analyse temperature variations and because it illustrates the limitations of our knowledge.
Recall that only one or two annual data points ago many scientists, as well as the most vocal ‘campaigners,’ dismissed the very idea that the world’s average annual temperature had not changed in the past decade. Today it is an observational fact that can no longer be ignored. We should also not forget that nobody anticipated it. Now, post facto, scientists are looking for an explanation, and in doing so we are seeing AGW in a new light.
The main conclusion, and perhaps it’s no surprise, to be drawn about what will happen to global temperatures is that nobody knows.
The other conclusion to be drawn is that without exception the papers assume a constantly increasing AGW in line with the increase of CO2. This means that any forecast will ultimately lead to rising temperatures, as AGW is forever upward and natural variations have their limits. But there is another way of looking at the data. Instead of assuming an increasing AGW, why not look for evidence of it in the actual data? In other words, let the data have primacy over the theory.
Lean and Rind try to isolate and analyse the various factors that affect decadal changes in the temperature record: El Nino, volcanic aerosols, solar irradiance and AGW. The formula that links these factors together into a time series is quite simple (indeed there is nothing complicated about any of the papers looking at future temperature trends), though in the actual research paper there is not enough information to follow through their calculations completely.
El Nino typically produces 0.2 deg C warming, volcanic aerosols 0.3 deg C cooling on short timescales, solar irradiance 0.1 deg C (I will come back to this figure in a subsequent post) and the IPCC estimate of AGW is 0.1 deg C per decade.
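To make the arithmetic concrete, the kind of simple additive model these papers use can be sketched as follows (a toy illustration only, not Lean and Rind’s actual regression; the index series and the exact coefficients are invented, built from the rough magnitudes just quoted):

```python
# A toy additive temperature model (illustrative only, not Lean & Rind's
# actual fit). Each natural-factor input is a normalized index in ~[-1, 1].
import numpy as np

def toy_anomaly(years, enso, volcanic, solar):
    agw = 0.01 * (years - years[0])                # 0.1 deg C per decade
    return 0.2 * enso - 0.3 * volcanic + 0.1 * solar + agw

years = np.arange(1980, 2009)
enso = np.sin(2 * np.pi * (years - 1980) / 4.0)    # ~4-year ENSO cycle
volcanic = (years == 1991).astype(float)           # a Pinatubo-like pulse
solar = np.sin(2 * np.pi * (years - 1980) / 11.0)  # ~11-year solar cycle
anomaly = toy_anomaly(years, enso, volcanic, solar)
```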
It should also be noted that natural forces are able to produce a 0.5 deg C increase, although over a longer period. The 0.5 deg C warming observed between say 1850 and 1940 is not due to AGW.
The temperature increase since 1980, approx 0.4 deg C, is in fact smaller than the rise seen between 1850 and 1940. It took place in less than two decades and was followed by the current standstill. A fact often overlooked is that this recent temperature increase was much greater than that due to the postulated AGW effect (0.1 deg C per decade). It must therefore have included natural increases of a greater magnitude.
This is curious. If the recent temperature standstill, 2002–2008, is due to natural factors counteracting AGW, and AGW was only a minor component of the 1980–1998 temperature rise, then one could logically take the viewpoint that the earlier increase was due to a conspiracy of natural factors forcing the temperature up, rather than natural factors keeping the temperature down post-2002. One cannot have one rule for the period 2002–2008 and another for 1980–1998!
Lean and Rind estimate that 73% of the temperature variability observed in recent decades is natural. However, looking at the observed range of natural variations, and their uncertainties, one could make a case that the AGW component, which has only possibly shown itself between 1980 and 1998, is not a required part of the dataset. Indeed, if one did not have in the back of one’s mind the rising CO2 concentration and the physics of the greenhouse effect, one could make out a good case for reproducing the post-1980 temperature dataset with no AGW!
Natural variations dominate any supposed AGW component over timescales of 3–4 decades. If that is so, then how should we regard 18 years of warming and decades of standstills or cooling in an AGW context? At what point do we question the hypothesis of CO2-induced warming?
Lean and Rind (2009) look at the various factors known to cause variability in the earth’s temperature over decadal timescales. They come to the conclusion that between 2009 and 2014 global temperatures will rise quickly, by 0.15 deg C – faster than the 0.1 deg C per decade deduced as AGW by the IPCC. Then, in the period 2014–19, there will be only a 0.03 deg C increase. They believe this will be chiefly because of the effect of solar irradiance changes over the solar cycle. Lean and Rind see the 2014–19 period as being similar to the 2002–8 temperature standstill, which they say has been caused by a decline in solar irradiance counteracting AGW.
This should cause some of the more strident commentators to reflect. Many papers have been published dismissing the sun as a significant factor in AGW. The gist of them is that solar effects dominated up to 1950 but have recently been swamped by AGW. Now, however, we see that the previously dismissed tiny solar effect is able to hold AGW in check for well over a decade – in fact forcing a temperature standstill of duration comparable to the recent warming spell.
At least the predictions from the various papers are testable. Lean and Rind (2009) predict rapid warming. Looking at the other forecasts for near-future temperature changes we have Smith et al (2007) predicting warming, and Keenlyside et al (2008) predicting cooling.
At this point I am reminded that James Hansen ‘raised the alarm’ about global warming in 1988 when he had less than a decade of noisy global warming data on which to base his concern. The amount of warming he observed between 1980 and 1988 was far smaller than known natural variations and far larger than the IPCC would go on to say was due to AGW during that period. So whatever the eventual outcome of the AGW debate, logically Hansen had no scientific case.
There are considerable uncertainties in our understanding of natural factors that affect the earth’s temperature record. Given the IPCC’s estimate of the strength of the postulated AGW warming, it is clear that those uncertainties are larger than the AGW effect that may have been observed.
References:
Lean and Rind, 2009, Geophys. Res. Lett., 36, L15708
Smith et al., 2007, Science, 317, 796–799
Keenlyside et al., 2008, Nature, 453, 84–88
Tom P,
Sorry [<— my apology] I haven't responded. I haven't visited this page since I last posted. I've been having some fun over on the Svensmark thread, and a couple others.
I’ve offered you several wagers, all of which you refuse to accept. So don't rag on me for not taking the bet you put out there. My wagers are no different than what you're trying to do. I've explained exactly why upthread.
Now, back to Svensmark.
Here is a more detailed repeat of the link I mentioned above to TomP and Smokey
+++++++++++
BBC Radio 4 at 1.30pm today: Vicky Pope of the Met Office reluctantly admits the climate has been cooling, against their expectations and models
http://news.bbc.co.uk/1/hi/programmes/more_or_less/8248922.stm#email
This is the BBC’s Tim Harford item (the link is found at the bottom of the box to the right of the item “Blowing cold, then hot”).
transcript
“Tim: If the cooling that the Leibniz Institute predicts actually takes place, are you worried that’s going to take the wind out of the sails of some of the scientists who are warning about the threat of global warming?
Vicky: It’s very important to realise that there will be ten-year periods where the temperatures don’t increase or they even decrease as the Leibniz study is suggesting –
Tim: We’ve just had one.
Vicky: Yes, in fact we have, but that doesn’t mean that global warming has stopped, it’s simply a question of natural variability, giving a temporary decrease in temperature overlaid on top of a long-term warming trend, and in fact I believe that’s what the results of that study suggest –
Tim: Sorry to interrupt, but you say that we’re going to have ten-year periods of cooling. How can we be sure that the rapid warming we saw in the 1980s and 1990s wasn’t the exceptional period?
Vicky: This is the point really, is that 1998 was exceptionally warm because there was an El Nino, because there was a natural variation overlaid on top of climate change. So what you can see very clearly is a long-term trend and then these periods of rapid warming and less rapid warming or even cooling overlaid on top of that because of natural variations.”
This should also be seen in the context of the New Scientist interview. All I am saying is that the models did not predict the (officially) admitted cooling and they are having to take ‘natural variability’ into greater account. I make no predictions as to whether this is the start of a longer cooling trend.
tonyb
Tom P (18:57:32) :
TonyB,
Quite correct that there was an admission of a drop. But not that it was against models and expectations – rather that natural variability can occasionally cause such dips.
Tom, do you accept that such natural variables as can cause occasional decadal dips by overcoming the almighty power of co2 can also therefore cause occasional decadal rises? Logic would seem to demand it, because if they are strong enough to overcome co2, they must rise as often as fall, or temperature would have diminished over the C20th not risen.
If you accept the inexorable logic of this, you also accept that some of the temperature rise attributed to co2 was in fact due to the positive phases of these other natural factors.
This has a considerable effect on the calculation of the co2 radiative forcing.
What say you?
Smokey,
“I’ve offered you several wagers, all of which you refuse to accept.”
These were:
1. “No one will be able to falsify the theory that global warming/cooling will go outside of its natural historical parameters.”
This statement is nonsense, though it probably means the opposite of what you intended.
2. “The planet will not be 3° C warmer in the next ten years.”
Certainly way beyond any projections. This would need a heat input way beyond anything humans could produce. This is a joke bet that only demonstrates a lack of good faith.
3. “That within the next ten years we will never reach the UN/IPCC’s AR-4 projections.”
This is better, but as I said before, this needs some numbers to be a meaningful bet. The IPCC projections can be found in AR4 here in figure 10.5:
http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-chapter10.pdf
I take it you are therefore willing to bet that the monthly UAH lower troposphere temperature anomaly for the next ten years will never reach the equivalent ensemble average time series anomaly for any of the three scenarios outlined in IPCC AR4 figure 10.5.
You pay up if and when any of the projections is reached. I pay up if after ten years the projections have never been reached.
How much do you want to bet? We can start next month.
Christopher Hanley – “But the temperature rise prior to 1940 cannot be attributed to anthropogenic emissions, i.e. the burning of fossil fuels.”
CO2 did not become a greenhouse gas in 1940. Your statement makes no sense.
Jim Clarke – “[the hypothesis of CO2 induced warming] only described about 60 years of the billions of years of climate history”
Another greenhouse effect denier! Amazing! The climate record, as a whole, could not be accounted for if there was no greenhouse effect. Your statement makes no sense.
An Inquirer – you’ve misunderstood the IPCC as well. Even if one factor is dominant, it does not follow that all other factors are negligible. You cannot account for the climate of the last 150 years without accounting for the greenhouse effect of CO2. You also cannot account for the climate of the last 150 years without accounting for the influence of the Sun and volcanoes. I think you’ve got some revision to do.
“As a scientist, I have studied more than a dozen proposed key variables in what has caused trends in the past, and CO2 emissions rank toward the bottom of variables that apparently have had a major influence.”
Where is your work published? Please give the journal reference.
What price El Nino?
From
http://weather.unisys.com/archive/sst/sst_anom_loop.gif
ending September 6, to:
http://weather.unisys.com/surface/sst_anom.html
it seems it is getting weaker and weaker.
I remember La Nina biting last year in the same plots.
tallbloke,
“Tom, do you accept that such natural variables as can cause occasional decadal dips by overcoming the almighty power of co2 can also therefore cause occasional decadal rises? Logic would seem to demand it, because if they are strong enough to overcome co2, they must rise as often as fall, or temperature would have diminished over the C20th not risen.
If you accept the inexorable logic of this, you also accept that some of the temperature rise attributed to co2 was in fact due to the positive phases of these other natural factors.
This has a considerable effect on the calculation of the co2 radiative forcing.
What say you?”
What we see in the satellite record are episodic rather than periodic inputs which cause a relatively brief excursion of a few years or so. I don’t see how they can have a long-term effect on climate. All the rest looks like noise that would average to zero as well.
Longer term there may be some periodic variations. They would certainly modulate any background trends. There is some recent work by Huang in isolating such variations using empirical mode decomposition of the HadCRUT dataset:
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1986583
This paper shows that the background trend is by far the largest part of the world’s temperature history, with perhaps a little more than 0.1 C out of today’s 0.8 C rise attributable to long-term oscillations. Given the other uncertainties in the CO2 forcing factor, a natural contribution to the temperature profile on this scale does not have an appreciable effect.
So, no doubt there are natural oscillatory contributions to the warming we see, but their effect is swamped by a longer term rise over the last 150 years.
Has anyone seen Smokey? He owes me both an attempt to explain this temperature history, as well as getting back to me on the bet he offered and which I’m ready to accept. If anyone else would like to respond in his place, please go ahead.
Ron Mexico (21:13:03) :
“These climate guys are young in their efforts…”
So true. Someone elsewhere suggested they should get these guys working on the Unified Field Theory, as they could probably knock it out in just a few years.
George E. Smith (14:41:48) :
“The aliasing noise injected by failure to observe proper sampling protocol, is in band noise so it is permanently unremovable without throwing away good signal as well. ”
Just because a signal has aliased components does not mean it is not useful. The eyes with which you are reading this only sample at something like 30 Hz. The question is, what signals are there beyond the Nyquist frequency, are they significant, and to what band are they aliasing? Often, we are interested only in low frequency information. This is why many sensors are of the integrating variety, because we can turn the delta-integrated data into an average, and averaging naturally attenuates signals that would alias to the low frequency spectrum.
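To make that concrete, here is a small invented example (not any particular sensor): a tone beyond the Nyquist frequency aliases at full strength under point sampling, but is strongly attenuated when each sample is an average over the sampling interval, i.e. an integrating sensor.

```python
# Invented example: point sampling vs. interval averaging of a tone
# above the Nyquist frequency.
import numpy as np

fs = 12.0                  # samples per year (monthly)
f_hi = 31.0                # tone well beyond Nyquist (fs/2 = 6 cycles/yr)
t = np.arange(0, 100, 1 / fs)

point = np.sin(2 * np.pi * f_hi * t)   # instantaneous samples

# Exact average of the tone over each interval [t, t + 1/fs]:
avg = (np.cos(2 * np.pi * f_hi * t)
       - np.cos(2 * np.pi * f_hi * (t + 1 / fs))) * fs / (2 * np.pi * f_hi)

print(np.abs(point).max())  # ~1.0: the alias comes through at full amplitude
print(np.abs(avg).max())    # ~0.12: attenuated by the sinc of the averager
```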
E.M.Smith (16:43:14) :
“An average of a bunch of thermometer readings MEANS NOTHING.”
You have a point. We do get a number that represents something, but what? It may be usefully compared with other samples, to the degree that they are similarly obtained, and we can thereby establish historical patterns, but is there really any basis for IPCC proclamations to the effect of “a 0.3 degree rise in global temperature will be catastrophic”? Maybe not so much. Thanks for bringing up the food for thought.
J. Bob (21:14:44) :
“The Fourier Convolution filter is preferred in that it covers the most recent end point, which the MOV and Recursive does not. ”
No, it is just interpolating the end points implicitly. What you should really be doing is designing a filter for a specific bandwidth so you know precisely what the frequency content of your output is. You should do a PSD to get an idea of precisely what the frequency content is. You can use a filtering algorithm such as this one to get a zero-phase result which provides the endpoints via a consistent approach.
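For illustration, a minimal sketch of the zero-phase idea using a forward-backward Butterworth filter; the data here are synthetic, and the 0.025 cycles/year cutoff (a 40-year period) is just an example value:

```python
# Zero-phase low-pass sketch: filtfilt runs the filter forward and then
# backward, cancelling the phase lag and handling the endpoints with a
# consistent padding rule.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
years = np.arange(1659, 2009)
temps = (0.2 * np.sin(2 * np.pi * years / 60.0)    # toy 60-year cycle
         + 0.3 * rng.standard_normal(years.size))  # toy weather noise

fs = 1.0                              # one sample per year
b, a = butter(4, 0.025 / (fs / 2))    # cutoff normalized to Nyquist
smoothed = filtfilt(b, a, temps)      # forward-backward: zero phase lag
```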
Interesting discussions, all. Thanks.
I did look at the whole PSD plot. Unfortunately I gave my Matlab and all the toolboxes away over a decade ago, so I’m using EXCEL & VB. However, running the data both directions is one thing I’ve tried on the FFT part. The “filtfilt” part I don’t remember in the Sig. Tool Box I had at the time. We did use “Padde” methods to correct for phase shift, but I think Sig. Cond. offers some interesting insights. Will take a look at “filtfilt” and see what shows up. What did you think of averaging the Rimfrost data?
Apologies J Bob – I guess I got caught up in the technicalities and forgot to comment on the actual substance.
If I took the pro-AGW side, I guess I would look at the last upswing and view it as proof that things are getting hotter. On the other side, I would note that the slope is nowhere out of the ordinary. Moreover, you could argue that the trend is tapering off at the end, as your FFT filter seems to indicate. But, it can then be argued that this is an artifact of how the FFT filter handles the endpoints via circular convolution.
What I would do with this data is use the PSD to determine the major frequency components, then create a model composed of the sum of sinusoids at those frequencies with amplitude and phase parameters and use least squares estimation techniques to derive those parameters. Actually, now that I think of it, the easiest way is as follows. Say I have frequency spikes at just omega1 and omega2. My model is
M = [a1 b1 a2 b2] * [cos(omega1*t) sin(omega1*t) cos(omega2*t) sin(omega2*t)]’
This is a linear model, and the coefficients a1, b1, a2, and b2 can easily be estimated.
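For instance, a minimal sketch of that estimation; the two frequencies and the toy “data” are invented purely for illustration:

```python
# Linear least-squares fit of cos/sin pairs at PSD-identified frequencies.
import numpy as np

t = np.arange(350.0)                              # e.g. 350 years of data
omega1 = 2 * np.pi / 60.0                         # assumed ~60-year cycle
omega2 = 2 * np.pi / 400.0                        # assumed ~400-year cycle

# Design matrix: one cos/sin column pair per frequency spike.
A = np.column_stack([np.cos(omega1 * t), np.sin(omega1 * t),
                     np.cos(omega2 * t), np.sin(omega2 * t)])

y = 0.3 * np.cos(omega1 * t - 0.5) + 0.4 * np.sin(omega2 * t)  # toy data
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)    # [a1, b1, a2, b2]
amp1 = np.hypot(coeffs[0], coeffs[1])             # amplitude of 60-yr term
phase1 = np.arctan2(coeffs[1], coeffs[0])         # phase of 60-yr term
```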
The thing is, with this data, observability of a periodic component longer than 400 years is very low. For all we know, if we could accurately extend the data series farther back, we might see a repeat of the constructive interference which may be occurring near the end of the data record, in which a roughly 400 year cycle adds to the 60 year cycle, causing a local maximum independent of anthropogenic forcing. You might hypothesize such a component, try it out in the estimation process above, and see if you can’t replicate the behavior.
You probably can. That’s really the whole nub of the problem with the AGW models. They also can be tweaked to fit the data over any finite interval. But, that does not mean the models are “truth”.
Bart, what I will do is post a more detailed portion of the FFT part of the analysis on this thread. One of the first things we did in a spectral analysis was to “echo”, or compare an unfiltered output to the original input. That saved a lot of arguments later on. If the unfiltered output matches the input over the range, can one say there are “losses” at the end points?
Bart,
“That’s really the whole nub of the problem with the AGW models. They also can be tweaked to fit the data over any finite interval. But, that does not mean the models are “truth”.”
There’s no need to tweak any parameters – if you use the right analytical tools. Fourier transform techniques are not the best way to isolate periodic signals and trends, as they can’t deal well with longer-term trends across the time series. This is, of course, unsurprising, as a Fourier transform can only deconstruct a signal into a series of sinusoids. Windowing only makes things worse, throwing away information about any trend in order to artificially force periodicity onto the system.
Empirical mode decomposition is a more promising approach, and requires no prior assumptions concerning the periodicity or otherwise of the data – there’s just nothing to tweak.
Have a look at the article I cited above:
http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1986583
As I mentioned, the most prominent signal that comes out of the data is a long-term warming trend overlaid with a much smaller 65-year cyclic component. Empirical mode decomposition is an unbiased way to isolate any warming trend from the data, and shows quite clearly that the world has indeed been warming for the last 150 years.
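If you want to try it yourself, here is a minimal sketch, assuming the third-party PyEMD package (pip install EMD-signal) and a synthetic trend-plus-cycle series rather than the actual HadCRUT data:

```python
# Empirical mode decomposition sketch: the signal is sifted into
# intrinsic mode functions (IMFs) plus a final monotonic residue, with
# no frequencies or basis functions specified in advance.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(0)
t = np.linspace(0, 150, 1800)                    # ~150 "years", monthly
signal = (0.005 * t                              # slow warming trend
          + 0.1 * np.sin(2 * np.pi * t / 65.0)   # ~65-year oscillation
          + 0.05 * rng.standard_normal(t.size))  # noise

emd = EMD()
emd.emd(signal)
imfs, residue = emd.get_imfs_and_residue()  # residue ~ the monotonic trend
```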
Tom – no matter how you slice it, with a finite data set, your confidence in estimation of any periodic process with a period approaching or exceeding the timeline of the data set diminishes rapidly. Likewise, separating an actual trend from a long term periodicity becomes increasingly problematical.
There exists no magical analytical tool which can surmount this difficulty. It is fundamental, like the uncertainty principle in quantum mechanics, or the Cramer-Rao lower bound (in fact, it is a manifestation of the Cramer-Rao lower bound).
We use Fourier analysis because every periodic signal can be represented as a Fourier series. Every single one. This is one of the most basic results of functional analysis.
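Explicitly: for a (piecewise-smooth) signal $f$ of period $T$,

$$f(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty}\left[a_n \cos\frac{2\pi n t}{T} + b_n \sin\frac{2\pi n t}{T}\right], \qquad a_n = \frac{2}{T}\int_0^T f(t)\cos\frac{2\pi n t}{T}\,dt,$$

and likewise for $b_n$ with sine. (The standard caveat: pointwise convergence needs mild regularity, e.g. piecewise smoothness, which covers essentially any physical record.)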
You have become enamored of an approach for which you do not understand the implicit assumptions. You cannot rule out a long term periodicity in the data. If you insist to me that you can, it will merely confirm for me that you do not know what you are talking about.
J Bob – what happens at the end points is necessarily a function of how you process the data. Information regarding what is happening beyond the end points is simply unavailable, and that information, unfortunately, is necessary to determine what is happening there with strong confidence.
Every path leads to assumptions, either explicit or implicit. For instance, in the approach I recommended of fitting periodic sinusoids, I am making the assumption that the model is periodic, and that the periodicity extends beyond the boundaries. This is generally a well-founded assumption, based on the behavior of countless observed processes in general circumstances.
Perhaps we are not dealing with a periodic signal? Perhaps, as Tom suggests, there is an actual linear, quadratic, cubic, or whatever polynomial trend? Maybe, maybe not. But, fundamentally, we cannot know from this finite chunk of data alone. Tom and the Warmists have made the assumption that there is, based on their intuition (or prejudice) regarding a significant human impact on the environment. Others, such as I, tend to intuit (or prejudge) a less significant impact.
Can we say objectively which one of us is right? No, we honestly cannot with the available information. But, we can objectively state that the Warmist agenda will bring immediate and severe hardship whether they are right or wrong, whereas the other direction only potentially brings hardship in the long term, before which we likely will have advanced far enough that we can deal with whatever occurs. That is what tips the balance for me.
“You have become enamored of an approach for which you do not understand the implicit assumptions.”
Perhaps I should have said “You have become enamored of an approach for which you do not, perhaps willfully, appreciate the implicit assumptions.”
A little less pejorative. I have wasted a lot of time in blog conversations which degrade into escalating meaningless barbs regarding the disputants’ native intelligences. I will take it as a given that Tom is a smart guy, and hope the courtesy is reciprocated, and I merely wish to open his eyes to considerations which he may not have… considered.
Tom P (13:52:22) :
So, no doubt there are natural oscillatory contributions to the warming we see, but their effect is swamped by a longer term rise over the last 150 years.
But given that the co2 effect is not generally thought to have become important until after WWII, the previous 2/3 of this long term rise has to be something else. If you are pleading some sort of special case that whatever previously caused the long term rise suddenly stopped, and co2 took over, you will have to be more specific about what that other factor was, or your case looks very untenable. Whatever it was, it couldn’t be random stuff that cancels out to zero after a few years, that’s for sure.
Tom – I have just reviewed your link. It makes me want to groan and put my head in my hands and sob.
He gets an arbitrary set of “IMF”s from cubic spline interpolation, which have no physical basis, and do not themselves generally form a basis, and subtracts them out of the data until he has no more local extrema. It’s gobbledy-goop.
There is a reason we choose trig functions and polynomials as bases. It is because that is the form a vast array of physical processes assume, because of natural integrations and projections, and the regularity of time as we define it. If you unmoor yourself from these functions, you have no basis (literally) to extrapolate beyond the boundaries.
There is no magic silver bullet here that would allow you to divine information which simply is not available. Moreover, there is no basis (again, literally) to interpret the result. There is no linkage to naturally occurring processes which would tend to support one’s interpretation, whatever that may be. There is only a process for manipulating data until you end up with a monotonic function of undetermined form.
Bart,
“He gets an arbitrary set of “IMF”s from cubic spline interpolation, which have no physical basis, and do not themselves generally form a basis, and subtracts them out of the data until he has no more local extrema. It’s gobbledy-goop.”
You have rather misunderstood empirical mode decomposition. Firstly it is a pure signal analysis technique – no assumptions need to be made about any physical basis to extract the components. This is part of its power, not a criticism.
As you said earlier “I am making the assumption that the model is periodic…” There is no need to make such an assumption. And there are plenty of physical processes that are non-periodic – a random walk for instance.
EMD does indeed produce an orthogonal basis set – otherwise it would be a meaningless technique.
And EMD most certainly does produce useful results, as its adoption in seismic and medical data processing demonstrates.
You are quite right, EMD decomposes the signal until a monotonic function remains. But this could be part of a longer period cyclic function. If this was indeed the case, the upwards trend seen over the last 150 years could be interpreted as part of a 600-year sinusoid with a peak-to-peak amplitude of 1.2C, a minimum at 1850 and a predicted maximum at 2150. There is, however, no physical basis for such a long-term periodicity and more importantly no sign in the earlier record of such a signal, for instance:
http://img9.imageshack.us/img9/1994/glaciervsinstrumental.png
tallbloke,
“the co2 effect is not generally thought to have become important until after WWII”
Where did you get that from? The general view is that CO2 warming became discernible from the middle of the last century as CO2 concentrations started to rise above background levels:
http://cdiac.ornl.gov/trends/co2/graphics/lawdome.gif
There in fact appears to be a close relationship between this plot and the glacier-derived temperatures above, or would you argue otherwise?
I admit to being lost about all the deconstruction and reconstruction of temperatures to provide trends or no trends, and to isolate or not isolate what part of the trends is due to human influence, primarily from CO2, or to natural causes. What concerns me is the implicit assumption that CO2 has been monotonically increasing during all the periods being reviewed, so that this is the one constant in all the debate and analysis. My point is that if CO2 has not increased monotonically, what relevance do the mathematical techniques concerning temperature have to the central question of whether CO2 has caused warming, or at least warming of the magnitude assumed by the IPCC, and therefore whether it can have played a part in temperature increases?
We should not allow the IPCC to choose the CO2 readings, should we? What if the long term trend of CO2 in the atmosphere was one quarter of the assumed increase? How then can one attribute increased/decreased temperature to CO2?
What evidence do I have for the proposition that CO2 has not been increasing? Please look here http://www.21stcenturysciencetech.com/Articles%202007/20_1-2_CO2_Scandal.pdf
I could reproduce excerpts from this if anyone desires, but if you have an open mind you will read all of it for yourself. Jaworowski has a huge reputation and had to be marginalised by the AGW believers to make their case. Another example of AGW cherry picking. Let’s look at the evidence, not just the maths techniques.
Tom P (18:08:15) :
tallbloke,
“the co2 effect is not generally thought to have become important until after WWII”
Where did you get that from? The general view is that CO2 warming became discernible from the middle of the last century as CO2 concentrations started to rise above background levels:
http://cdiac.ornl.gov/trends/co2/graphics/lawdome.gif
There in fact appears to be a close relationship between this plot and the glacier-derived temperatures above, or would you argue otherwise?
Of course I would argue otherwise, that’s what keeps these debates interesting. 😉
If you want to argue that the co2 effect kicked in earlier, you need to get specific about what caused the cooling in the late 1800’s and the 1940-1970 period. Which brings us back to the oceanic oscillations and solar variation the AGW hypothesis has to dismiss as random small scale noise to survive.
Then there is the complete lack of correlation between co2 and the Medieval warm period. Of course, Law Dome is a long way from the location of the historical records which show the MWP was warmer than now in many parts of the world, but that’s a deficiency of the data you are trying to argue from, not evidence that the MWP didn’t happen.
Then of course there are the co2 measurements made by C19th scientists which don’t fit the theory, and have been quietly dropped…
Speaking of dropping, this thread is about to drop off the bottom of the list, so cheers, and see you on the next thread which tickles both our interests.
Alan Sutherland,
“Jaworowski has a huge reputation, and had to be marginalised by the AGW believers to make their case.”
He certainly has a reputation for his past work on radiation effects. But the article you cite, like his other writings on climate change, is found in the non-refereed “21st Century Science and Technology”, a magazine published by the Lyndon LaRouche movement, perhaps the most bizarre political grouping in the US. Jaworowski has marginalised himself here!
tallbloke,
“If you want to argue that the co2 effect kicked in earlier, you need to get specific about what caused the cooling in the late 1800’s and the 1940-1970 period.”
No, first you have to explain the dominant warming trend in the signal. Of course the warming has not always been increasing, and there are weaker cooling periods to explain, some of which I’m sure are natural variability. But to dismiss an effect from CO2 on such grounds is to ignore the elephant in the room.
“There is, however, no physical basis for such a long-term periodicity and more importantly no sign in the earlier record of such a signal.”
There are all kinds of periodicities. Here, for example, is a 400 year cycle. I’m not promoting or otherwise affirming the paper – I just pulled it up randomly on Google. I’m just saying, there is a lot more going on than the rising CO2 narrative which we are being spoon-fed by the promoters of AGW.
I’m not saying there is a malevolent conspiracy afoot, at least a conscious one, among AGW adherents. But, I do believe they, like the Queen of Hearts, made their verdict first, and held the trial afterward. And, when you are searching only for evidence which supports your preconceived bias, mirabile dictu, that is what you tend to find. I believe they, in their hearts, believe they are doing the right thing, but it all flows from the initial conviction: we are doing something to the planet, let’s find out what.
The first principle is that you must not fool yourself – and you are the easiest person to fool.
Richard Feynman, Caltech commencement address, 1974
Bart, here is a more detailed description of my Fourier analysis.
The following is an expansion on the use of Fourier convolution methods in signal conditioning, or attempting to get information out of “noisy” signals. The methods used were those recommended in Blackman & Tukey’s book “The Measurement of Power Spectra”. These methods were later refined by Cooley & Tukey’s presentations on the Fast Fourier Transform, which they developed.
The signal in question was an average of 14 very long term temperature records, starting in a period from 1659 (Central England) to about 1800. The data came from the Rimfrost site, http://www.rimfrost.no/ The primary purpose was to look at direct temperature measurement records, and evaluate how current temperatures compare to those of 150 to 300 years ago.
One of the tools used, along with moving averages and recursive filtering, was Fourier convolution, or filtering. In Fourier filtering, an input signal is converted to the frequency domain. There, the frequency content is evaluated, and certain frequencies are kept or removed, depending on the user’s purpose. In this case, frequencies greater than 0.025 cycles/year (40 year period) were removed. The result was then transformed back to the time domain for analysis.
The top insert in the figure below shows the averaged temperature (Ave14), along with a de-trend line. This line intersects the end points of the data set. This “de-trending” is done to avoid “leakage” problems with Fourier convolution. The lower figure shows the difference, or error, from the de-trend line, noting that both end points are equal to zero.
http://www.imagenerd.com/uploads/ave14-de-trend-CkLob.gif
The actual convolution is performed on this “error” from the “de-trend” line, and is shown below. In this case, the “error”, or difference, is inserted into a sample frame of 512 samples (due to FFT requirements). The non-Ave14 portion is padded with zeros so no discontinuity is present at the sample end points. This is shown in the top insert in the figure below.
http://www.imagenerd.com/uploads/ave14-raw-fft-echo1-2Tgav.gif
The top insert superimposes two plots: the input, and the output of the convolution, or filtered input. In this case no filtering is applied, the purpose being to check the computational procedures. Notice that the output is virtually on top of the input, indicating a good re-construction of the input signal. The lower insert is the power spectral density plot (actually ½ of it) that shows the energy contained in the various frequencies. Note that amplitudes get smaller at higher frequencies, while more energy is concentrated in the lower ones, starting at about 0.14 cycles per year, or 8 year periods. This tapering off of energy at the higher frequencies also indicates “pre-whitening” is not needed.
The next figure shows the same data shifted to the center of the sample period. As expected, the PSD is the same even though the signal has been shifted to the center of the sample period. This would indicate that “windowing”, using a Hamming, Hanning or other “window”, would not be needed.
http://www.imagenerd.com/uploads/ave14-raw-fft-echo2-p9mBY.gif
Again the reconstructed signal is virtually identical to the input, indicating how well the Fourier filter can reconstruct a signal.
The figure below shows the effect of a “mask” that removes frequencies above 0.025 cycles/year. Basically a low pass filter.
http://www.imagenerd.com/uploads/ave14-40_yr_filter-sct4B.gif
The top insert shows the input and output signals, while the lower one shows the PSD plot after the “mask” is applied. The resultant signal is a smoothed line that shows the lower frequency content of the input, uncluttered by the higher frequencies.
The last step is to use the “de-trend” line and the filtered signal to re-construct the filtered average (Ave14), shown below.
http://www.imagenerd.com/uploads/ave14-de-trend_40_yr_filter-esCIc.gif
Here one can see that the filtered signal does a good job of following the data. The question is what happens at the beginning and end points, especially the end one. While all “tools” have their strengths and weaknesses, a variety of methods are generally used to evaluate data, and this is but one. However, it would appear that the Fourier method does seem to work a little better at the end points, especially if there are real cyclical elements embedded in the signal. This seemed to be confirmed by looking at the Chebyshev filter results in a previous posting. This is in spite of transients caused by the sudden addition of more stations.
Looking at the end portion, one could make the case that we are entering a downward cycle in the climate. However, in the PSD there are other frequencies that have a considerable amount of “energy”. There is a group in the 0.025 to 0.07 range, as well as in the 0.1 to 0.4 range, that warrants evaluation. The one thing that does stand out is that this crude analysis indicates considerable temperature variation in western Europe. It would indicate that there have been fairly warm periods in the past, and the 1850 data point seemed to be a relative low point in the climate temperature cycle. Anyway, there is still a lot to do (solar & north Atlantic variation for starters).
An attempt also was made to keep the analysis simple, so that anyone with some programming knowledge of VB and EXCEL can do a fair amount of analysis on their own.
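For those who prefer neither VB nor MATLAB, the same pipeline can be sketched in a few lines of Python – a reconstruction under the assumptions stated above (yearly data, a 512-sample frame, end-point de-trending, and a 0.025 cycles/year mask):

```python
# De-trend -> zero-pad -> FFT -> mask -> inverse FFT -> re-trend, as
# described above. Yearly sampling is assumed; requires n <= frame.
import numpy as np

def fourier_lowpass(temps, cutoff=0.025, frame=512):
    n = temps.size
    t = np.arange(n)
    # De-trend with the line through the two end points (avoids leakage);
    # the "error" from this line is zero at both ends.
    trend = temps[0] + (temps[-1] - temps[0]) * t / (n - 1)
    err = temps - trend
    # Zero-pad into the FFT frame so there is no end-point discontinuity.
    padded = np.zeros(frame)
    padded[:n] = err
    spec = np.fft.rfft(padded)
    freqs = np.fft.rfftfreq(frame, d=1.0)    # cycles/year
    # "Echo" check: with no mask applied, irfft(spec) reproduces the input.
    spec[freqs > cutoff] = 0.0               # low-pass mask (40-yr period)
    smoothed = np.fft.irfft(spec, frame)[:n]
    return trend + smoothed                  # re-attach the de-trend line
```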
Bart, my MATLAB is more than a little rusty, so if you could formulate your model in a more standard notation, I would appreciate it.