Crowdsourcing A Full Kernel Cascaded Triple Running Mean Low Pass Filter, No Seriously…

Fig 4-HadCrut4 Monthly Anomalies with CTRM Annual, 15 and 75 years low pass filters

Image Credit: Climate Data Blog

By Richard Linsley Hood  – Edited by Just The Facts

The goal of this crowdsourcing thread is to present a 12 month/365 day Cascaded Triple Running Mean (CTRM) filter, inform readers of its basis and value, and gather your input on how I can improve and develop it. A 12 month/365 day CTRM filter completely removes the annual ‘cycle’, as the CTRM is a near Gaussian low pass filter. In fact it is slightly better than Gaussian in that it completely removes the 12 month ‘cycle’ whereas true Gaussian leaves a small residual of that still in the data. This new tool is an attempt to produce a more accurate treatment of climate data and see what new perspectives, if any, it uncovers. This tool builds on the good work by Greg Goodman, with Vaughan Pratt’s valuable input, on this thread on Climate Etc.

Before we get too far into this, let me explain some of the terminology that will be used in this article:

—————-

Filter:

“In signal processing, a filter is a device or process that removes from a signal some unwanted component or feature. Filtering is a class of signal processing, the defining feature of filters being the complete or partial suppression of some aspect of the signal. Most often, this means removing some frequencies and not others in order to suppress interfering signals and reduce background noise.” Wikipedia.

Gaussian Filter:

A Gaussian Filter is probably the ideal filter in time domain terms. That is, if you consider that the graphs you are looking at are like the ones displayed on an oscilloscope, then a Gaussian filter is the one that adds the least distortion to the signal.

Full Kernel Filter:

Indicates that the output of the filter will not change when new data is added (except to extend the existing plot). It does not extend up to the ends of the data available, because the output is in the centre of the input range. This is its biggest limitation.

Low Pass Filter:

A low pass filter is one which removes the high frequency components in a signal. One of its most common usages is in anti-aliasing filters for conditioning signals prior to analog-to-digital conversion. Daily, Monthly and Annual averages are low pass filters also.

Cascaded:

A cascade is where you feed the output of the first stage into the input of the next stage and so on. In a spreadsheet implementation of a CTRM you can produce a single average column in the normal way and then use that column as an input to create the next output column and so on. The value of the inter-stage multiplier/divider is very important. It should be set to 1.2067. This is the precise value that makes the CTRM into a near Gaussian filter. It gives values of 12, 10 and 8 months for the three stages in an Annual filter for example.
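For those who would rather see the cascade as code than as spreadsheet columns, here is a minimal sketch in Python (illustration only; rounding the shrunken windows to whole samples is my assumption, and the spreadsheet version works the same way column by column):

```python
def running_mean(x, n):
    # Plain running mean with window n; the output is shorter than the
    # input because no padding is done at the ends ("full kernel").
    return [sum(x[i:i + n]) / n for i in range(len(x) - n + 1)]

def ctrm(x, period=12, ratio=1.2067):
    # Cascaded Triple Running Mean: each stage's output feeds the next,
    # with the window shrunk by the 1.2067 inter-stage ratio each time.
    n1 = period                        # 12 months for an annual filter
    n2 = round(period / ratio)         # -> 10
    n3 = round(period / ratio ** 2)    # -> 8
    return running_mean(running_mean(running_mean(x, n1), n2), n3)
```

With period=12 this reproduces the 12, 10 and 8 month stages described above. The output is 27 samples shorter than the input, which is the full kernel limitation already noted.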

Triple Running Mean:

The simplest method to remove high frequencies or smooth data is to use moving averages, also referred to as running means. A running mean filter is the standard ‘average’ that is most commonly used in Climate work. On its own it is a very bad form of filter and produces a lot of arithmetic artefacts. Adding three of those ‘back to back’ in a cascade, however, allows for a much higher quality filter that is also very easy to implement. It just needs two more stages than are normally used.

—————

With all of this in mind, a CTRM filter, used either at 365 days (if we have data at that resolution) or 12 months in length with the most common data sets, will completely remove the Annual cycle while retaining the underlying monthly sampling frequency in the output. In fact it is even better than that, as it does not matter whether the data used has already been normalised. A CTRM filter will produce the same output on either raw or normalised data, apart from a small constant offset reflecting whatever ‘Normal’ period the data provider chose. There are no added distortions of any sort from the filter.
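That invariance is easy to check numerically. Below is a sketch in Python with a made-up monthly series (the trend, seasonal shape, noise level and seed are arbitrary illustrations, not real data):

```python
import math
import random

def rmean(x, n):
    # centred running mean; output shrinks by n - 1 samples (full kernel)
    return [sum(x[i:i + n]) / n for i in range(len(x) - n + 1)]

def ctrm12(x):
    # the 12/10/8 month cascaded triple running mean
    return rmean(rmean(rmean(x, 12), 10), 8)

# Made-up monthly series: trend + seasonal 'Normals' + noise (illustration only)
random.seed(1)
normals = [5 * math.sin(2 * math.pi * m / 12) for m in range(12)]
raw = [0.01 * i + normals[i % 12] + random.gauss(0, 0.2) for i in range(240)]
anom = [raw[i] - normals[i % 12] for i in range(240)]   # 'anomaly' version

f_raw, f_anom = ctrm12(raw), ctrm12(anom)
offset = f_raw[0] - f_anom[0]
# the two filtered series differ only by a constant offset
assert all(abs((a - b) - offset) < 1e-9 for a, b in zip(f_raw, f_anom))
```

The reason is simple: the filter is linear, and the first 12-month stage turns any 12-periodic ‘Normals’ signal into a constant, which the later stages pass through unchanged.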

Let’s take a look at what this generates in practice. The following are UAH Anomalies from 1979 to Present with an Annual CTRM applied:

Fig 1-Feb UAH Monthly Global Anomalies with CTRM Annual low pass filter

Fig 1: UAH data with an Annual CTRM filter

Note that I have just plotted the data points. The CTRM filter has removed the ‘visual noise’ that month-to-month variability causes. This is very similar to the 12 or 13 month single running mean that is often used, however it is more accurate as the mathematical errors produced by those simple running means are removed. Additionally, the higher frequencies are completely removed while all the lower frequencies are left completely intact.
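The difference from a lone running mean can be made concrete with the closed-form frequency response of an N-point mean (the Dirichlet kernel). A sketch in Python using the 12/10/8 month stages:

```python
import math

def mean_response(n, f):
    # Frequency response of an n-point running mean at f cycles per sample
    # (the Dirichlet kernel); f = 1/12 is the annual cycle in monthly data.
    if f == 0:
        return 1.0
    return math.sin(math.pi * f * n) / (n * math.sin(math.pi * f))

def ctrm_response(f):
    # cascade response = product of the three stage responses (12, 10, 8)
    return mean_response(12, f) * mean_response(10, f) * mean_response(8, f)

print(mean_response(12, 1 / 12))   # ~0: the annual cycle is nulled
print(mean_response(12, 1 / 9))    # ~ -0.21: a lone 12-month mean passes
                                   # 9-month wiggles at 21%, INVERTED
print(ctrm_response(1 / 9))        # ~ 0.0026: the cascade removes them
```

Those negative side lobes, which invert some sub-annual wiggles, are the arithmetic artefacts of the single running mean; multiplying the three staggered stage responses together collapses them to near zero.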

The following are HadCRUT4 Anomalies from 1850 to Present with an Annual CTRM applied:

Fig 2-Jan HadCrut4 Monthly Anomalies with CTRM Annual low pass filter

Fig 2: HadCRUT4 data with an Annual CTRM filter

Note again that all the higher frequencies have been removed and the lower frequencies are all displayed without distortions or noise.

There is a small issue with these CTRM filters in that CTRMs are ‘full kernel’ filters as mentioned above, meaning their outputs will not change when new data is added (except to extend the existing plot). However, because the output is in the middle of the input data, they do not extend up to the ends of the data available as can be seen above. In order to overcome this issue, some additional work will be required.

The basic principles of filters work over all timescales, thus we do not need to constrain ourselves to an Annual filter. We are, after all, trying to determine how this complex load that is the Earth reacts to the constantly varying surface input and surface reflection/absorption with very long timescale storage and release systems including phase change, mass transport and the like. If this were some giant mechanical structure slowly vibrating away we would run low pass filters with much longer time constants to see what was down in the sub-harmonics. So let’s do just that for Climate.

When I applied a standard time/energy low pass filter sweep against the data I noticed that there is a sweet spot around 12-20 years where the output changes very little. This looks like it may well be a good stop/pass band binary chop point, so I chose 15 years as the roll-off point to see what happens. Remember this is a standard low pass/band-pass filter, similar to the one that splits telephone from broadband to connect to the Internet. Using this approach, all frequencies with a period above 15 years are fully preserved in the output and all frequencies below that point are completely removed.
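Scaling the same construction to a 15 year cutoff on monthly data is just a matter of recomputing the three window lengths (a sketch; rounding to whole months is an assumption):

```python
RATIO = 1.2067   # the inter-stage divider that makes the cascade near Gaussian

def stage_windows(months):
    # window lengths (in samples) of the three cascaded means,
    # rounded to whole months
    return [round(months / RATIO ** k) for k in range(3)]

print(stage_windows(12))    # [12, 10, 8]    : the annual filter
print(stage_windows(180))   # [180, 149, 124]: the 15 year filter, monthly data
```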

The following are HadCRUT4 Anomalies from 1850 to Present with a 15 year CTRM and a 75 year single mean applied:

Fig 3-Jan HadCrut4 Monthly Anomalies with CTRM Annual, 15 and 75 years low pass filters

Fig 3: HadCRUT4 with additional greater than 15 year low pass. Greater than 75 year low pass filter included to remove the red trace discovered by the first pass.

Now, when reviewing the plot above some have claimed that this is a curve fitting or ‘cycle mania’ exercise. However, the data hasn’t been fitted to anything; I just applied a filter. Then out pops a wriggle at around ~60 years which the data draws all on its own. It’s the data what done it – not me! If you see any ‘cycle’ in the graph, then that’s your perception. What you can’t do is say the wriggle is not there. That’s what the DATA says is there.

Note that the extra ‘greater than 75 years’ single running mean is included to remove the discovered ~60 year line, as one would normally do to get whatever residual is left. Only a single stage running mean can be used as the data available is too short for a full triple cascaded set. The UAH and RSS data series are too short to run a full greater than 15 year triple cascade pass on them, but it is possible to do a greater than 7.5 year which I’ll leave for a future exercise.

And that Full Kernel problem? We can add a Savitzky-Golay filter to the set, which is the Engineering equivalent of LOWESS in Statistics, so it should not meet too much resistance from statisticians (want to bet?).

Fig 4-Jan HadCrut4 Monthly Anomalies with CTRM Annual, 15 and 75 years low pass filters and S-G

Fig 4: HadCRUT4 with additional S-G projections to observe near term future trends

We can verify that the parameters chosen are correct because the line closely follows the full kernel filter if that is used as a training/verification guide. The latest part of the line should not be considered an absolute guide to the future. Like LOWESS, S-G will ‘whip’ around on new data like a caterpillar searching for a new leaf. However, it tends to follow a similar trajectory, at least until it runs into a tree. While this is only a basic predictive tool, which estimates that the future will be like the recent past, the tool estimates that we are over a local peak and headed downwards…
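For anyone who wants to experiment with S-G itself, here is a minimal sketch of the quadratic Savitzky-Golay smoother in pure Python, using the standard closed-form window weights. Note this shows only the central, full kernel part; the asymmetric end-point fits that let S-G run out to the data boundary (its main attraction here) are not included.

```python
def savgol_coeffs(m):
    # Closed-form smoothing weights for a quadratic (or cubic) least-squares
    # Savitzky-Golay fit over a centred window of 2m + 1 points.
    denom = (2 * m + 3) * (2 * m + 1) * (2 * m - 1)
    return [(3 * (3 * m * m + 3 * m - 1) - 15 * i * i) / denom
            for i in range(-m, m + 1)]

def savgol_smooth(x, m):
    # Central (full kernel) application only: the output stops m points short
    # of each end; reaching the boundary needs asymmetric end weights.
    c = savgol_coeffs(m)
    return [sum(cj * x[i + j] for j, cj in enumerate(c))
            for i in range(len(x) - 2 * m)]
```

The classic 5-point case (m=2) gives the familiar weights (-3, 12, 17, 12, -3)/35, and because the underlying fit is a quadratic, the smoother reproduces any quadratic trend exactly while averaging out the noise around it.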

And there we have it. A simple data treatment for the various temperature data sets, a high quality filter that removes the noise and helps us to see the bigger picture. Something to test the various claims made as to how the climate system works. Want to compare it against CO2? Go for it. Want to check SO2? Again, fine. Volcanoes? Be my guest. Here is a spreadsheet containing UAH and an Annual CTRM, and R code for a simple RSS graph. Please just don’t complain if the results from the data don’t meet your expectations. This is just data and summaries of the data. Occam’s Razor for a temperature series. Very simple, but it should be very revealing.

Now the question is how I can improve it. Do you see any flaws in the methodology or tool I’ve developed? Do you know how I can make it more accurate, more effective or more accessible? What other data sets do you think might be good candidates for a CTRM filter? Are there any particular combinations of data sets that you would like to see? You may have noted the 15 year CTRM combining UAH, RSS, HadCRUT and GISS at the head of this article. I have been developing various options at my new Climate Data Blog and based upon your input on this thread, I am planning a follow up article that will delve into some combinations of data sets, some of their similarities and some of their differences.

About the Author: Richard Linsley Hood holds an MSc in System Design and has been working as a ‘Practicing Logician’ (aka Computer Geek) to look at signals, images and the modelling of things in general inside computers for over 40 years now. This is his first venture into Climate Science and temperature analysis.

355 Comments
Greg Goodman
March 17, 2014 1:48 am

Nice article Richard.
One criticism is the 75y RM. Having roundly accepted how bad and distorting these damn things are, you still put one in there. And what it shows is not that much use even if we could believe it.
In general, when the kernel length appears to be a problem you will get a “smoother”, non-distorted result with a 3RM of half the period. That will not help remove a 60 year periodicity.
BTW, I think this ‘cycle’ is closer to a full-wave rectified cosine : abs(cos(x)) than a fully harmonic oscillation. Heavy filtering tends to round things off a bit two much and everything ends up looking like cosines.
This seems to fit arctic ice, though it’s way too soon to see how this up tick will go.
http://climategrog.wordpress.com/?attachment_id=783
Similar abs_cos form in AMO and cyclone ACE:
http://climategrog.wordpress.com/?attachment_id=215
My money is on circa 128y abs_cos rather than 60y full cosine. Fourier techniques will tend to mask this too unless the analyst looks carefully at the presence of harmonics.
Great article though. The traffic on WUWT should be helpful in getting greater awareness of the defects and distortions of runny means.

Greg Goodman
March 17, 2014 1:49 am

two much = too much 😉

Kelvin vaughan
March 17, 2014 1:59 am

Filter it enough and you will end up with a straight line showing a slow rise in temperature.

george e. smith
March 17, 2014 2:05 am

Filters throw away information.
The raw data, is the most information you can ever have.
Filters simply delude one into believing that something else is happening; other than what the measured sampled data values tell.

Greg Goodman
March 17, 2014 2:16 am

“Filters throw away information.”
No, filters separate information. What you “throw” away is your choice.
“Filters simply delude one into believing that…”
Delusion is in the eye of the beholder. That some may draw wrong conclusions or make inappropriate analyses is not a reason never to analyse data.
This whole “the data IS the data” meme is totally ignorant. You are not going to learn much by staring at a noisy dataset with a massive annual signal except that the data is noisy and has a strong annual cycle. If that is all you feel competent to do then by all means stay where you feel comfortable. That does not mean that no one ever got any useful information, or that centuries of expertise in data processing are nothing but delusional.
Get real.

Greg Goodman
March 17, 2014 2:24 am

Bernie Hutchins. “… the instructor’s words were exactly: “Astoundingly, this is an FIR filter.” I WAS astounded ”
So am I.
I’ve always been dubious of LOESS filters because they have a frequency response that changes with the data content. I’ll have to have a closer look at SG.
It does have a rather lumpy stop band though. I tend to prefer Lanczos for a filter with a flat pass band ( the major defect of gaussian and the triple RM types ).

Orson
March 17, 2014 2:38 am

The back-and-forth comments from our more mathematically learned friends are quite excellent for us noobs. All too often, technical analysis goes under the radar of ordinary minds. Thanks, Anthony, for giving us a chance to see some ‘worksheets’ being used…up?

March 17, 2014 2:45 am

I am very grateful to RichardLH for his courteous answers to my questions.
But I should like more detail on why the particular function he has chosen is any better at telling us what is happening in the data than the linear trends used by the IPCC etc. No trend on stochastic data has any predictive value. And we already know from the linear trend on global temperature that the rate of warming is very considerably below what was predicted – indeed, that there has been no global warming to speak of for about a quarter of a century. So, what does your curve tell us about global mean surface temperature that we do not already know?

Editor
March 17, 2014 2:52 am

Greg Goodman says:
March 17, 2014 at 12:48 am

Willis:

“As you can see, the two are so similar that you cannot even see your filter underneath the gaussian filter … so I repeat my question. Why do we need a new filter that is indistinguishable from a gaussian filter?”

It is not indistinguishable, as Richard correctly says: it totally removes an annual signal. You attempt to show this is “indistinguishable” by using them to filter temperature “anomaly” data that has already had most of the annual signal removed.
Now do the same with actual temps and you will see the difference. The advantage of having a filter that can fully remove the huge annual cycle is that you don’t need to mess about with “anomaly” data, which itself leaks a possibly inverted 12 month signal as soon as the annual cycle changes from that of the reference period.

Not true in the slightest, Greg. You really should try some examples before uncapping your electronic pen.

Once again, the CTRM filter underneath the gaussian filter is scarcely visible … so I repeat my question. Why do we need a new filter that is indistinguishable from a gaussian filter?
w.

cd
March 17, 2014 3:04 am

Richard or others
Nice concise post. I have three questions – sorry.
1) Are the cascade filters based on recursive operations using some basis functions? If yes, then is this not akin to curve fitting? I’ve probably misunderstood but I think you’ve taken issue with this before.
2) In the field I work in we tend to use the Butterworth filter extensively; but in light of the recent post on “signal stationarity” in WUWT, does such an approach seem inappropriate if the composed data series is the sum of many non-stationary signals. I suspect that there will be major issues with significant localised phase shifts (something I understand to be a problem with the Butterworth filter even with “perfect” data sets).
3) Finally, for the purposes of quant operations, is the filtered data any use beyond signal analysis? For example, even if one uses the best filtering method, does the resulting processed series not carry significant bias when one tries to statistically quantify something like the correlation between two filtered data sets? This seems a little open ended I know, but you see this type of analysis all the time.

Editor
March 17, 2014 3:12 am

Greg Goodman says:
March 17, 2014 at 1:25 am

Willis: “As tempting as it may be to read a “cycle” into it, there is no “~ 60 year cycle”. It’s just a wiggle. ”
There is a long term change in there too Willis. Cooling at end of 19th c , warming since beginning of 20th. When you add a slope to a pure cosine you will find that it shifts the peaks and troughs, when you have differing slopes behind it they will get moved back and forth. That is the cause of the irregularity of the intervals you have noted.

And you know that this “is the cause of the irregularity” exactly how?

Once again, you display your lack of understanding and over-confidence in your own abilities, then start telling others the way it is.

Me? I’m the one saying it’s premature to claim cycles with periods in the temperature record. You’re the one trying to tell us the way it is. You’re the one claiming that there are regular cycles in the temperature data. I’m the one saying we don’t know the way it is, and until we do it’s foolish to ascribe it to some mythical “approximately sixty year” cycles …
w.
PS—A fully worked out example, showing the “differing slopes” and how they change a pure cosine wave into the shape shown by the HadCRUT4 data would advance your cause immensely … remember, according to you, you can do it using only a single pure cosine wave plus “differing slopes behind it”. Note that you need to have upswings of 18 years, 40 years, and 41 years. You also need to have downswings of 29 years in one case and 19 years in the other.
I say you can’t do that with only the things you described, a single pure cosine wave plus differing slopes. I say you’re just waving your hands and making things up in an unsuccessful attempt to show that I’m wrong.
But hey, I’m willing to be convinced … break out the pure cosine wave and the differing slopes and show us how it’s done!

steveta_uk
March 17, 2014 3:12 am

Richard, can you design a filter to remove comments from the people who don’t believe the data in the first place?
Why they bother commenting is a mystery to me ;(

RichardLH
March 17, 2014 3:32 am

davidmhoffer says:
March 16, 2014 at 7:34 pm
“RichardLH
Pulsed input is also integrated by matter as well so I will still differ.
>>>>>>>>>>>>>>>.
It most certainly is not.”
It most certainly is. Both inputs and outputs are. That was your original point and I still dispute your claim.
“The tropics are net absorbers of energy, the arctic regions are net emitters. Massive amounts of energy are moved from tropics to arctic regions by means of air and ocean currents, and these are not integrated by matter as they would be if the sole mechanism was conductance.”
Mass transport, phase change and the like are different beasts. They have their own rules but none of them alter what I said.
“You’ve got an interesting way of looking at the data, just apply it to data that is meaningful.”
Thank you, I do.

D. Cohen
March 17, 2014 3:33 am

What happens when you apply the same filters to the sunspot data over the last 150 or so years?

RichardLH
March 17, 2014 3:42 am

Stephen Rasey says:
March 16, 2014 at 7:45 pm
“A beautiful example of frequency content that I expect to be found in millennial scale uncut temperature records is found in Lui-2011 Fig. 2.”
Ah – the power spectrum graphs.
“Lui sees a minor 60 (or 70) year cycle. But it seems less significant that the others.”
The problem with all these sorts of studies is the same as with FFTs and Wavelets (see above). Great if you have noise-free data; the more noise you have, the less useful they are.
Also that point about ‘half wave’ mixing applies. Nature rarely does things with full sine waves (unless it is a tuned string or the like). Most stuff is a lot more complex and very, very noisy/erratic.
Stuff comes and goes into that noise, which makes it very difficult to see reliably. Proxy data is usually worse, as each one adds its own variety of noise to pollute the answer.
So with a long proxy record there are some with the resolution required to see a ~60 year record. Shen is one ftp://ftp.ncdc.noaa.gov/pub/data/paleo/historical/pacific/pdo-shen2006.txt which is a rainfall re-construction of the PDO. http://climatedatablog.wordpress.com/pdo/
I am always looking for others with the required resolution. That is one of the points of this thread.

RichardLH
March 17, 2014 3:45 am

Bernie Hutchins says:
March 16, 2014 at 7:46 pm
“Unless you have a physical argument of other evidence, curve fitting is near useless. And the physics – let’s call it “incomplete”. Again, data fit to a curve is not THE data. ”
But this is most definitely NOT a curve fitting exercise! Indeed this is the exact opposite. If the curve the data draws by low pass filtering does not match your theory then your theory is wrong. This helps (I hope) in that endeavour.

RichardLH
March 17, 2014 3:47 am

John West says:
March 16, 2014 at 8:17 pm
“Sorry guys but this is absolutely worthless. ”
IYHO presumably. This does allow quite a detailed look at the temperature series that are available. It allows for meaningful comparisons between those series.

RichardLH
March 17, 2014 3:50 am

DocMartyn says:
March 16, 2014 at 8:50 pm
“Richard, does the fact that months are of unequal length screw the plot.”
True. I adjust them all to a 1/12 spacing (these are really tiny bar graphs with a width of the sample rate), so there is some jitter associated with that. I don’t think it will affect the outcome in any detectable way.

RichardLH
March 17, 2014 3:57 am

Richard says:
March 16, 2014 at 9:18 pm
“I have had an opportunity to look at your spreadsheet, btw, thanks for sharing it.”
Thank you.
“I have some observations and questions about the methods used.
1. You use three non causal cascaded FIR filters, with 8, 10, and 12 taps respectively.”
Correct
” 2. These are just plain averaged filters without any weighting.”
Correct
” 3. While doing this will provide you a pseudo Gaussian response, why not use some weighting parameters for a Gaussian response?”
Because creating a function that provides a true Gaussian Kernel and then iterating that kernel over the input doesn’t provide any more accuracy. Occam’s Razor really.
“4. By using even number taps you are weighting the averages to one side or the other in the time series, and this will affect your phase relationship.”
True but for 12 months there is no choice. The spreadsheet does take account of this as the middle column is one row higher to help reduce this. In the end it is only a small phase shift which you can correct for if you believe it is required.
” 5. Were any windowing corrections considered?”
The 1.2067 inter-stage multiplier/divider corrects all/most of the square wave sampling errors if that is what you are asking.
” 6. Were there any Fourier analysis performed on the data before and after filtering?”
No.
“I use filters on data all of the time in my work, and am always concerned when someone does filtering and fails to mention Fco, Order, and Filtering Function (i.e. Bessel, Butterworth, Gaussian, etc).”
As I said, a CTRM is a near (very, very near) Gaussian function.
For frequency responses see Greg’s work
http://climatedatablog.files.wordpress.com/2014/02/fig-2-low-pass-gaussian-ctrm-compare.png
http://climatedatablog.files.wordpress.com/2014/02/fig-1-gaussian-simple-mean-frequency-plots.png

RichardLH
March 17, 2014 4:00 am

Bernie Hutchins says:
March 16, 2014 at 9:56 pm
“Regarding our Savitzky-Golay Discussion: Here is a less cryptic but just single page outline of the SG.m program (first 20 trivial lines at most!) ”
Well R already contains a Savitzky-Golay filter, which I use in a multi-pass form
http://climatedatablog.wordpress.com/2014/03/15/r-code-for-simple-rss-graph/


#”I ran a 5 pass-multipass with second order polynomials on 15 year data windows as per the Savitzky-Golay method.” Nate Drake PhD
SavitzkyGolay <- function(data, period=12)
{
  # Five successive Savitzky-Golay passes over a (2 * period + 1) point window
  f1 = period * 2 + 1
  SavitzkyGolay = signal::sgolayfilt(data, n=f1)
  SavitzkyGolay = signal::sgolayfilt(SavitzkyGolay, n=f1)
  SavitzkyGolay = signal::sgolayfilt(SavitzkyGolay, n=f1)
  SavitzkyGolay = signal::sgolayfilt(SavitzkyGolay, n=f1)
  SavitzkyGolay = signal::sgolayfilt(SavitzkyGolay, n=f1)
  SavitzkyGolay   # return the smoothed series
}

RichardLH
March 17, 2014 4:14 am

Willis Eschenbach says:
March 17, 2014 at 12:22 am
“First, Richard, thanks for your work. Also, kudos for the R code, helped immensely.”
Thank you.
“My first question regarding the filter is … why a new filter? What defect in the existing filters are you trying to solve?”
Simplicity and accuracy.
“Mmmm … if that’s the only advantage, I’d be hesitant. I haven’t run the numbers but it sounds like for all practical purposes they would be about identical if you choose the width of the gaussian filter to match … hang on … OK, here’s a look at your filter versus a gaussian filter:…As you can see, the two are so similar that you cannot even see your filter underneath the gaussian filter … so I repeat my question. Why do we need a new filter that is indistinguishable from a gaussian filter?”
Actually it is just slightly better than a Gaussian. It completely removes the 12 month cycle rather than leaving a small sliver of that still in the output.
http://climatedatablog.files.wordpress.com/2014/02/fig-2-low-pass-gaussian-ctrm-compare.png
“There is indeed a “wiggle” in the data, which incidentally is a great word to describe the curve. It is a grave mistake, however, to assume or assert that said wiggle has a frequency or a cycle length or a phase.”
The choice of words was because I know I can’t prove a ‘cycle’ with what we have. Doesn’t mean you cannot observe what is there though.
“Let me show you why, using your data: The blue dashed vertical lines show the troughs of the wiggles. The red dashed vertical lines show the peaks of the wiggles. As tempting as it may be to read a “cycle” into it, there is no “~ 60 year cycle”. It’s just a wiggle. Look at the variation in the lengths of the rising parts of the wiggle—18 years, 40 years, and 41 years. The same is true of the falling parts of the wiggle. They are 29 years in one case and 19 years in the other. Nothing even resembling regular.”
Hmmm. I would question your choice of inflexion points. To do it properly it would probably be best to de-trend the curve first with the greater than 75 years line (not a straight line!) to get the central crossing points and then do any measurements. Peak to peak is always subject to outliers, so it is usually regarded as less diagnostic. But as there are only two cycles this is all probably moot anyway. If there is anything else mixed in with this other than pure sine waves then all bets are off for period, phase and wave shape.
I just display what is there and see where it goes.
“The problem with nature is that you’ll have what looks like a regular cycle … but then at some point, it fades out and is replaced by some other cycle. ”
The interesting thing is when you do comparisons to some proxy data with the required resolution.
http://climatedatablog.files.wordpress.com/2014/02/pdo-reconstruction-1470-1998-shen-2006-with-gaussian-low-pass-30-and-75-year-filters-and-hadcrut-overlay.png
Then out pops some possible correlation that does need addressing.
“To sum up, you are correct that “what you can’t do is say the wriggle is not there”. It is there. However it is not a cycle. It is a wiggle, from which we can conclude … well … nothing, particularly about the future. ”
Well the 15 year S-G trend says the immediate future is downwards. You could conclude that, if the ~60 year ‘cycle’ repeats, then the downwards section is going to be 30 years long. Time alone will tell.

tadchem
March 17, 2014 4:18 am

I have used Fourier analysis successfully to detect periodic and pseudo-periodic effects on data. When viewed in the frequency space, the peaks in the Fourier Transform identify such effects, while shallows, regions in the transform space where peaks are notably absent, identify ‘sweet spots’ where the cutoffs for filters can be tuned.
When applied to sunspot numbers. for example, there is a group of peaks corresponding to 11-13 year periods and a secondary group at about a 100 year period, but relatively little amplitude in between.
That is about the limit of applicability of the Fourier transform, however. The periodicity inherent in the mathematics makes it useless for extrapolations or for dealing with secular trends. It will, however, allow optimization of tuned filters.

March 17, 2014 4:19 am

RichardLH;
It most certainly is.
>>>>>>>>>>>>>>>>>>>>>
How sad that both warmists and skeptics have become so obsessed with the measurement of average temperature and the analysis of it that we’ve completely forgotten what the original theory was that we were trying to prove or disprove. The theory is that doubling of CO2 increases downward LW by 3.7 w/m2. So, to determine if the theory is correct or not, we run right out and analyze a metric that has no linear relationship to w/m2 at all, and rationalize that somehow this is OK because the earth integrates it out for us.
One of the first things one ought to learn about understanding a problem is to use data that is as closely related to the problem as possible. Temperature is one order of data removed from the actual problem. Supposing that it is “integrated” is just such a supposition. It cannot be verified except by looking at the root data first to see if that theory holds. If we designed planes and bridges using second order data, there’d be a lot of engineers with their @ss on the line trying to explain why they did such an incredibly stupid thing and wound up responsible for the deaths of so many people.
But this is climate science so wild @ss guesses based on 2nd order data are OK for both advancing and refuting the theory.
pfffffft.

RichardLH
March 17, 2014 4:21 am

Greg Goodman says:
March 17, 2014 at 1:48 am
“Nice article Richard.
Once criticism is the 75y RM . Having roundly accepted how bad and distorting these damn things are you still put one in there. And what is shows is not that much use even if we could believe it. ”
Thanks. You and Vaughan got me into using the correct 1.2067 value in the first place.
The single mean for the 75 year is Hobson’s choice really. The data is just not long enough for a full triple pass. Most of the error terms fall outside the pass band hopefully. I could use a 75 year S-G and, if I were to do any de-trending work, that is what I would probably use.
What it does show quite clearly is that all the ‘cycle’ is gone at 75 years, which was the main point. Whatever is there, it lies between 15 and 75 years and looks to be ~60 years long.
“Great article though. The traffic on WUWT should be helpful in getting greater awareness of the defects and distortions of runny means.”
Thanks again. Single means are just SO bad.

RichardLH
March 17, 2014 4:23 am

Kelvin vaughan says:
March 17, 2014 at 1:59 am
“Filter it enough and you will end up with a straight line showing a slow rise in temperature.”
Actually you don’t. You end up with the greater than 75 year curve which is at the limit of what the data shows.
