Guest Post by Willis Eschenbach
Loehle and Scafetta recently posted a piece on decomposing the HadCRUT3 temperature record into a couple of component cycles plus a trend. I disagreed with their analysis on a variety of grounds. In the process, I was reminded of work I had done a few years ago using what is called “Periodicity Analysis” (PDF).
A couple of centuries ago, a gentleman named Fourier showed that any signal could be uniquely decomposed into a number of sine waves with different periods. Fourier analysis has been a mainstay analytical tool since that time. It allows us to detect any underlying regular sinusoidal cycles in a chaotic signal.
Figure 1. Joseph Fourier, looking like the world’s happiest mathematician
While Fourier analysis is very useful, it has a few shortcomings. First, it can only extract sinusoidal signals. Second, although it has good resolution at short timescales, it has poor resolution at the longer timescales. For many kinds of cyclical analysis, I prefer periodicity analysis.
So how does periodicity analysis work? The citation above gives a very technical description of the process, and it’s where I learned how to do periodicity analysis. Let me attempt to give a simpler description, although I recommend the citation for mathematicians.
Periodicity analysis breaks down a signal into cycles, but not sinusoidal cycles. It does so by directly averaging the data itself, so that it shows the actual cycles rather than theoretical cycles.
For example, suppose that we want to find the actual cycle of length two in a given dataset. We can do it by numbering the data points in order, and then dividing them into odd- and even-numbered data points. If we average all of the odd data points, and average all of the even data points, that gives us the average cycle of length two in the data. Here is what we get when we apply that procedure to the HadCRUT3 dataset:
Figure 2. Periodicity in the HadCRUT3 global surface temperature dataset, with a cycle length of 2. The cycle has been extended to be as long as the original dataset.
As you might imagine for a cycle of length 2, it is a simple zigzag. The amplitude is quite small, only plus/minus a hundredth of a degree. So we can conclude that there is only a tiny cycle of length two in the HadCRUT3.
Next, here is the same analysis, but with a cycle length of four. To do the analysis, we number the dataset in order with a cycle of four, i.e. “1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4 …”
Then we average all the “ones” together, and all of the twos and the threes and the fours. When we plot these out, we see the following pattern:
Figure 3. Periodicity in the HadCRUT3 global surface temperature dataset, with a cycle length of 4. The cycle has been extended to be as long as the original dataset.
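In code, the numbering-and-averaging step is just grouping the data by index modulo the cycle length. The post's appendix does this in R with `tapply`; here is a minimal Python sketch of the same idea (the function name `cycle_means` is mine, not from the post):

```python
import numpy as np

def cycle_means(x, period):
    """Average the data points that share the same position within the cycle,
    then tile that average cycle out to the full length of the series."""
    x = np.asarray(x, dtype=float)
    idx = np.arange(len(x)) % period          # position of each point in the cycle
    means = np.array([x[idx == k].mean() for k in range(period)])
    return np.tile(means, len(x) // period + 1)[:len(x)]

# A length-4 example: averaging the "ones", "twos", "threes", and "fours"
data = [1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4]
print(cycle_means(data, 4))
```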
As I mentioned above, we are not reducing the dataset to sinusoidal (sine-wave-shaped) cycles. Instead, we are determining the actual cycles in the dataset. This becomes more evident when we look at, say, the twenty-year cycle:
Figure 4. Periodicity in the HadCRUT3 dataset, with a cycle length of 20. The cycle has been extended to be as long as the original dataset.
Note that the actual 20 year cycle is not sinusoidal. Instead, it rises quite sharply, and then decays slowly.
Now, as you can see from the three examples above, the amplitudes of the various length cycles are quite different. If we set the mean (average) of the original data to zero, we can measure the power in the cyclical underlying signals as the sum of the absolute values of the signal data. It is useful to compare this power value to the total power in the original signal. If we do this at all possible frequencies, we get a graph of the strength of each of the underlying cycles.
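The power measure just described, the sum of absolute values of the zero-meaned cycle compared against that of the original signal, can be sketched as follows (Python rather than the post's R; `power_fraction` is my name for it):

```python
import numpy as np

def power_fraction(x, period):
    """Fraction of the signal's power (sum of absolute values) captured by
    the average cycle of the given length."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                           # set the mean to zero
    idx = np.arange(len(x)) % period
    means = np.array([x[idx == k].mean() for k in range(period)])
    cycle = np.tile(means, len(x) // period + 1)[:len(x)]
    return np.abs(cycle).sum() / np.abs(x).sum()

# A pure period-4 pattern is captured completely; white noise mostly averages away
print(power_fraction(np.tile([1.0, -1.0, 0.5, -0.5], 40), 4))  # → 1.0
```

Computing this for every cycle length up to half the dataset length produces the blue periodicity curves shown in the figures.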
For example, suppose we are looking at a simple sine wave with a period of 24 years. Figure 5 shows the sine wave, along with periodicity analysis in blue showing the power in each of the various length cycles:
Figure 5. A sine wave, along with the periodicity analysis of all cycles up to half the length of the dataset.
Looking at Figure 5, we can see one clear difference between Fourier analysis and periodicity analysis — the periodicity analysis shows peaks at 24, 48, and 72 years, while a Fourier analysis of the same data would only show the 24-year cycle. Of course, the apparent 48 and 72 year peaks are merely a result of the 24 year cycle. Note also that the shortest length peak (24 years) is sharper than the longest length (72-year) peak. This is because there are fewer data points to measure and average when we are dealing with longer time spans, so the sharp peaks tend to broaden with increasing cycle length.
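The harmonic peaks are easy to reproduce numerically: averaging a period-24 sine at cycle lengths 24, 48, or 72 returns the sine unchanged, while a nearby length like 20 largely averages it away. A Python sketch (my code, not the post's R):

```python
import numpy as np

def power_fraction(x, period):
    """Share of the signal's power captured by the average cycle of this length."""
    x = np.asarray(x, float) - np.mean(x)
    idx = np.arange(len(x)) % period
    means = np.array([x[idx == k].mean() for k in range(period)])
    cycle = np.tile(means, len(x) // period + 1)[:len(x)]
    return np.abs(cycle).sum() / np.abs(x).sum()

x = np.sin(2 * np.pi * np.arange(158) / 24)    # a 24-year sine, HadCRUT3 length
for p in (24, 48, 72, 20):
    print(p, round(power_fraction(x, p), 3))
# the multiples of 24 capture essentially all of the power; 20 captures little
```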
To move to a more interesting example relevant to the Loehle/Scafetta paper, consider the barycentric cycle of the sun. The sun rotates around the center of mass of the solar system. As it rotates, it speeds up and slows down because of the varying pull of the planets. What are the underlying cycles?
We can use periodicity analysis to find the cycles that have the most effect on the barycentric velocity. Figure 6 shows the process, step by step:
Figure 6. Periodicity analysis of the annual barycentric velocity data.
The top row shows the barycentric data on the left, along with the amount of power in cycles of various lengths on the right in blue. The periodicity diagram at the top right shows that the overwhelming majority of the power in the barycentric data comes from a ~20 year cycle. It also demonstrates what we saw above, the spreading of the peaks of the signal at longer time periods because of the decreasing amount of data.
The second row left panel shows the signal that is left once we subtract out the 20-year cycle from the barycentric data. The periodicity diagram on the second row right shows that after we remove the 20-year cycle, the maximum amount of power is in the 83 year cycle. So as before, we remove that 83-year cycle.
Once that is done, the third-row right panel shows that there is a clear 19-year cycle (visible as peaks at 19, 38, 57, and 76 years); this may be a result of the fact that the “20-year cycle” is actually slightly less than 20 years. When that 19-year cycle is removed, there is a 13-year cycle visible at 13, 26, 39 years and so on. And once that 13-year cycle is removed … well, there’s not much left at all.
The bottom left panel shows the original barycentric data in black, and the reconstruction made by adding just these four cycles of different lengths is shown in blue. As you can see, these four cycles are sufficient to reconstruct the barycentric data quite closely. This shows that we’ve done a valid deconstruction of the original data.
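The step-by-step removal in Figure 6 amounts to a loop: find the strongest cycle, subtract its tiled average, and repeat on the residual. Below is a simplified Python version of one round (the synthetic 20-year ramp, the noise level, and the 2–30 search range are my choices for illustration; the post works on the real barycentric data in R):

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_cycle(x, period):
    """The average cycle of the given length, tiled to the full series length."""
    idx = np.arange(len(x)) % period
    means = np.array([x[idx == k].mean() for k in range(period)])
    return np.tile(means, len(x) // period + 1)[:len(x)]

def power_fraction(x, period):
    return np.abs(avg_cycle(x, period)).sum() / np.abs(x).sum()

# synthetic data: a non-sinusoidal 20-year ramp plus a little noise
saw = (np.arange(158) % 20) / 10.0 - 0.95      # slow rise, sharp drop each cycle
x = saw + 0.05 * rng.standard_normal(158)
x = x - x.mean()

# step 1: find the dominant cycle length (searching 2..30 to skip harmonics)
powers = {p: power_fraction(x, p) for p in range(2, 31)}
best = max(powers, key=powers.get)
print("dominant cycle:", best)                  # → 20

# step 2: subtract it; the residual's period-20 means are zeroed by construction
residual = x - avg_cycle(x, best)
print("residual power at 20:", power_fraction(residual, 20))  # ~0
```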
Now, what does all of this have to do with the Loehle/Scafetta paper? Well, two things. First, in the discussion on that thread I had said that I thought that the 60 year cycle that Loehle/Scafetta said was in the barycentric data was very weak. As the analysis above shows, the barycentric data does not have any kind of strong 60-year underlying cycle. Loehle/Scafetta claimed that there were ~ 20-year and ~ 60-year cycles in both the solar barycentric data and the surface temperature data. I find no such 60-year cycle in the barycentric data.
However, that’s not what I set out to investigate. I started all of this because I thought that the analysis of random red-noise datasets might show spurious cycles. So I made up some random red-noise datasets the same length as the HadCRUT3 annual temperature records (158 years), and I checked to see if they contained what look like cycles.
A “red-noise” dataset is one which is “auto-correlated”. In a temperature dataset, auto-correlated means that today’s temperature depends in part on yesterday’s temperature. One kind of red-noise data is created by what are called “ARMA” processes. “AR” stands for “auto-regressive”, and “MA” stands for “moving average”. This kind of random noise is very similar to observational datasets such as the HadCRUT3 dataset.
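For readers who want to generate similar red noise themselves: an ARMA(1,1) series is built by feeding white noise through the recursion x[t] = ar·x[t−1] + e[t] + ma·e[t−1]. The post does this in R with `arima.sim`; below is an equivalent Python sketch using the AR and MA values quoted in the appendix:

```python
import numpy as np

def arma11(n, ar, ma, rng, burn=200):
    """Generate n points of an ARMA(1,1) process, discarding a burn-in period."""
    e = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = ar * x[t - 1] + e[t] + ma * e[t - 1]
    return x[burn:]

rng = np.random.default_rng(42)
x = arma11(158, ar=0.9673, ma=-0.4591, rng=rng)

# standardize to HadCRUT3's standard deviation, as in the post's appendix
pseudo = (x - x.mean()) * 0.2546 / x.std()

# strong lag-1 autocorrelation is what makes this "red" rather than white noise
r1 = np.corrcoef(pseudo[:-1], pseudo[1:])[0, 1]
print(round(r1, 2))
```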
So, I made up a couple dozen random ARMA “pseudo-temperature” datasets using the AR and MA values calculated from the HadCRUT3 dataset, and I ran a periodicity analysis on each of the pseudo-temperature datasets to see what kinds of cycles they contained. Figure 7 shows eight of the two dozen random pseudo-temperature datasets in black, along with the corresponding periodicity analysis of the power in various cycles in blue to the right of each dataset:
Figure 7. Pseudo-temperature datasets (black lines) and their associated periodicity (blue circles). All pseudo-temperature datasets have been detrended.
Note that all of these pseudo-temperature datasets have some kind of apparent underlying cycles, as shown by the peaks in the periodicity analyses in blue on the right. But because they are purely random data, these are only pseudo-cycles, not real underlying cycles. Despite being clearly visible in the data and in the periodicity analyses, the cycles are an artifact of the auto-correlation of the datasets.
So for example random set 1 shows a strong cycle of about 42 years. Random set 6 shows two strong cycles, of about 38 and 65 years. Random set 17 shows a strong ~ 45-year cycle, and a weaker cycle around 20 years or so. We see this same pattern in all eight of the pseudo-temperature datasets, with random set 20 having cycles at 22 and 44 years, and random set 21 having a 60-year cycle and weak smaller cycles.
That is the main problem with the Loehle/Scafetta paper. While they do in fact find cycles in the HadCRUT3 data, the cycles are neither stronger nor more apparent than the cycles in the random datasets above. In other words, there is no indication at all that the HadCRUT3 dataset has any kind of significant multi-decadal cycles.
How do I know that?
Well, one of the datasets shown in Figure 7 above is actually not a random dataset. It is the HadCRUT3 surface temperature dataset itself … and it is indistinguishable from the truly random datasets in terms of its underlying cycles. All of them have visible cycles, it’s true, in some cases strong cycles … but they don’t mean anything.
w.
APPENDIX:
I did the work in the R computer language. Here’s the code, giving the “periods” function which does the periodicity function calculations. I’m not that fluent in R, it’s about the eighth computer language I’ve learned, so it might be kinda klutzy.
#FUNCTIONS
PI=4*atan(1) # value of pi
dsin=function(x) sin(PI*x/180) # sine function for degrees
regb =function(x) {lm(x~c(1:length(x)))[[1]][[1]]} #gives the intercept of the trend line
regm =function(x) {lm(x~c(1:length(x)))[[1]][[2]]} #gives the slope of the trend line
detrend = function(x){ #detrends a line
x-(regm(x)*c(1:length(x))+regb(x))
}
meanbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle means
rep(tapply(x,modline,mean),length.out=length(x))
}
countbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle number of datapoints N
rep(tapply(x,modline,length),length.out=length(x))
}
sdbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle standard deviations
rep(tapply(x,modline,sd),length.out=length(x))
}
normmatrix=function(x) sum(abs(x)) #returns the norm of the dataset, which is proportional to the power in the signal
# Function “periods” (below) is the main function that calculates the percentage of power in each of the cycles. It takes as input the data being analyzed (inputx). It displays the strength of each cycle. It returns a list of the power of the cycles (vals), along with the means (means), number of datapoints N (count), and standard deviations (sds).
# There’s probably an easier way to do this, I’ve used a brute force method. It’s slow on big datasets
periods=function(inputx,detrendit=TRUE,doplot=TRUE,val_lim=1/2) {
x=inputx
if (detrendit==TRUE) x=detrend(as.vector(inputx))
xlen=length(x)
modmatrix=matrix(NA, xlen,xlen)
modmatrix=matrix((col(modmatrix)-1)%%row(modmatrix),xlen,xlen) # row = cycle length; %% is R's modulo operator
countmatrix=aperm(apply(modmatrix,1,countbyrow,x))
meanmatrix=aperm(apply(modmatrix,1,meanbyrow,x))
sdmatrix=aperm(apply(modmatrix,1,sdbyrow,x))
xpower=normmatrix(x)
powerlist=apply(meanmatrix,1,normmatrix)/xpower
plotlist=powerlist[1:(length(powerlist)*val_lim)]
if (doplot) plot(plotlist,ylim=c(0,1),ylab="% of total power",xlab="Cycle Length (yrs)",col="blue")
invisible(list(vals=powerlist,means=meanmatrix,count=countmatrix,sds=sdmatrix))
}
# /////////////////////////// END OF FUNCTIONS
# TEST
# each row in the values returned represents a different period length.
myreturn=periods(c(1,2,1,4,1,2,1,8,1,2,2,4,1,2,1,8,6,5))
myreturn$vals
myreturn$means
myreturn$sds
myreturn$count
#ARIMA pseudotemps
# note that they are standardized to a mean of zero and a standard deviation of 0.2546, which is the standard deviation of the HadCRUT3 dataset.
# each row is a pseudotemperature record
instances=24 # number of records
instlength=158 # length of each record
rand1=matrix(arima.sim(list(order=c(1,0,1), ar=.9673,ma=-.4591),
n=instances*instlength),instlength,instances) #create pseudotemps
pseudotemps =(rand1-mean(rand1))*.2546/sd(rand1)
# Periodicity analysis of simple sine wave
par(mfrow=c(1,2),mai=c(.8,.8,.2,.2)*.8,mgp=c(2,1,0)) # split window
sintest=dsin((0:157)*15)# sine function
plotx=sintest
plot(detrend(plotx)~c(1850:2007),type="l",ylab="24 year sine wave",xlab="Year")
myperiod=periods(plotx)
Richard Saumarez says:
August 2, 2011 at 3:28 pm
“This is decimated at 1/month, giving a Nyquist frequency of 1/(2 months). Since this is aliased, the monthly record is corrupt.”
You’re absolutely correct. I’ve long been an advocate of SEMI-decimation, but very few understand the issue.
1sky1 says:
August 2, 2011 at 3:06 pm
You can’t have it both ways. Either it is useful in detecting strictly periodic signals from those that are not (it is, as you say), or, it provides no new analytic insight.
As to whether it “appeals strongly to primitive intuition”, I love the underlying mathematical class-based claim that some kinds of intuition are “primitive”. It also neglects the fact that we have “primitive intuitions” because historically they’ve worked …
I’m not sure what you mean by “mistakenly periodic”.
I find the lack of orthogonality to be a great advantage at times. Periodicity analysis shows me the actual shape and size of the real 20-year signal in the data. As far as I know it is the only method of its type that can do this. For example, as Figure 4 above shows, the 20-year cycle in the data consists of a slow decline in temperature, followed by a very quick warming. I find this quite interesting, as there are enough 20-year cycles in the data to give 8 data points each. That’s miles from ideal … but it gives the best idea available of the actual 20-year cycle as opposed to some theoretical sine wave.
It is also capable of finding a simplified solution which is not apparent in Fourier analysis. For example, if we add a square wave and a triangle wave with periods of say 7 and 11, Fourier analysis will show the insanely complex pattern of sine waves that you’d need to add together to get that square plus triangle wave.
Periodicity analysis, on the other hand, reveals the exact shape (square, triangular, or whatever) of the underlying dominant patterns. Specifically because it is not orthogonal, it can reveal the shape and period of the square and triangle waves that make up the final result. This ability to examine and recognize patterns adds to our understanding of what we are discussing.
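That claim is easy to check numerically. If the series length is a multiple of both periods, the two components don't leak into each other's averages, and the extracted period-7 cycle is exactly the square wave, shape and all. A Python sketch (my construction, using the periods 7 and 11 from the example above):

```python
import numpy as np

def avg_cycle_pattern(x, period):
    """One period of the average cycle itself (not tiled out)."""
    idx = np.arange(len(x)) % period
    return np.array([x[idx == k].mean() for k in range(period)])

n = 154                                   # a multiple of both 7 and 11
square = np.where(np.arange(7) < 4, 1.0, -1.0)
square -= square.mean()                   # zero-mean components
tri = np.abs(np.arange(11) - 5).astype(float)
tri -= tri.mean()

x = np.tile(square, n // 7) + np.tile(tri, n // 11)

# periodicity analysis recovers each component's actual shape
print(avg_cycle_pattern(x, 7))            # the square wave, not a set of sines
print(avg_cycle_pattern(x, 11))           # the triangle wave
```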
At least that’s my primitive intuition, your intuition may vary …
I’m not down on any of the methods, Fourier or wavelet or others. I’m pointing out that there is an additional method, periodicity analysis, which (like all methods) has its advantages and disadvantages. Unlike all the rest, it can reveal the actual shape of the intermediate repeating patterns that exist in the dataset. Among other things, it reinforces the idea that there are cycles at all frequencies, the only question is their strength.
I’m sorry if my writing is that unclear. I’m not dissing Joe Fourier, he’s one of my personal heroes, his was a stupendous insight. I started this with a picture of Fourier, in homage to the man.
1sky1, you may have noted before I respond to someone, I post their exact words to which I am responding. This avoids misunderstandings on either side – if I am mistaken in what I understand from your words, we both have your original words to refer to. That way, my misunderstanding can be cleared up.
In this case you make a statement that I am “claiming” something, which you say is largely on the basis of some other thing, and that I’m concluding something else … my friend, I’ve written thousands of words on various subjects, including almost two thousand in the head post on this thread alone, never mind what I’ve written in the comments. What exactly did I say that you are responding to?
Without knowing exactly what I said, I don’t even know what you are objecting to. Without that, I don’t know where our mutual areas of understanding (which are great) began to diverge.
So everyone, please. QUOTE MY EXACT WORDS so we can all know what it is you are talking about.
Thanks,
w.
Willis,
The claim of pseudocycles is in the title of your post. Then in your comments
Willis Eschenbach says:
July 30, 2011 at 10:42 am
“For the Loehle/Scafetta paper to be valid, the cycles need to be valid. The actual 20 year cycle in the temperature data is extremely weak. Their claimed 60 year cycle in the temperature data cannot be determined to be real. ”
I have an upcoming appointment this evening, but I’ll find time tomorrow to locate your comment on how red noise explains multidecadal cycles and then discuss the attendant technical issues.
Geoff Sharp says: “Willis is losing a lot of creditability here.”
I don’t believe Willis’s credit score has been impacted by anything he’s written on this thread.
Geoff Sharp says: “So many ill founded attacks on those that disagree with the basic fabric of this thread.”
You are the only blogger I know who considers questions posed to you and requests of you to be attacks.
Geoff Sharp says: “There seems to be a trend lately that WUWT is promoting Luke Warmer resident guest authors on a permanent basis.”
I would have avoided commenting on this thread, but I accept this part of your comment to be directed at me as well. Here’s a suggestion, Geoff. Maybe you should create your own website and blog where you can have the guest authors of your choice.
I second Bob Tisdale’s post above. Willis is one of only a handful of people that I find 100% credible.
Leif Svalgaard says:
August 2, 2011 at 4:52 pm
Geoff Sharp says:
August 2, 2011 at 3:29 pm
I provided a clock face with variable trident arm as an analogy to demonstrate how cycles can go undetected using your method.
I don’t understand why you make this strident claim as those cycles are not undetected at all. Here they are [in red circle]: http://www.leif.org/research/FFT-Barycenter-Distance-170.png
They are [like the 60-yr cycles] completely insignificant, but they are there, as they should be.
This is the exact point I am trying to make, Leif. Your analysis shows the 172-year cycle as insignificant. But what does the actual data tell us? It tells us without any doubt there is a 172-year fluctuation (cycle) in distance (AM) that is far from insignificant. If the theory is correct it means the 172-year wave is responsible for, or at least correlates with, solar cycle modulation; the same logic is being used by Scafetta and his 60-year cycle. I am saying undetected is basically the same as insignificant.
http://tinyurl.com/2dg9u22/images/Powerwave.png
Willis says “I find no such 60-year cycle in the barycentric data.”
Scafetta and I have shown there is a clear 60-year cycle in the solar velocity record. We use different scientific methods to isolate that cycle.
One method of analysis shows little or no significance of a cycle and another method shows a very clear cycle that is fundamental to a scientific theory. Willis has put all his eggs in one basket without taking into consideration other methods. This is his decision, but it is also very bad science. I object to the title of this thread based on these reasons.
@richard Saumarez:
> I did my PhD in a World class signal processing laboratory …
Hmm, you claim to have a PhD in signal processing, yet you seem to be ignorant of Shannon’s sampling theorem. How can that be? In what area is your PhD?
Your objections were somewhat incoherent and in no way refute Shannon’s theorem. For example, I specifically stated the theorem only holds for band limited signals, where the Nyquist limit is observed. And of course signal averaging acts like a low-pass filter, so how does that refute Shannon’s theorem?
If you really had doctoral-level knowledge of signal processing you would certainly understand this notion of ‘perfect reconstruction’ of analog signals from their digitized samples.
Bob Tisdale and Smokey:
Above (at August 1, 2011 at 1:04 pm) I have already supported your position but I write to iterate my view by citing that post because I think it needs repeating here.
Richard
tallbloke says:
August 2, 2011 at 8:17 am
Hi Geoff, sorry I haven’t found the cable yet, I’ll have a look tonight. Your 110k year cycle for Jupiter sounds very interesting, and I’d like more info on that in return for digging out the plots. Over the 6000 years I looked at (annual datapoints), the 172 year signal was evident throughout, but was modulated on longer cycles too, just like the x-y data. Given the A/M exchange between (IIRC) Jupiter and Neptune at the Hallstatt cycle length, I think the precessions might be tied together, certainly for those planets, and quite likely for the others too, if perturbation theory is anywhere like right. But over the long term (millions of years) the solar system is chaotic, and some big events will occur which will change things drastically. However, the degree of order and synchronisation we see currently must come about somehow. It could be that there is a self organising principle at work which actively causes the planets to adopt as stable a pattern as possible after a disruptive event. I suspect it’s tied to, is influenced by and influences solar activity levels in a true cybernetic system of feedbacks.
Tying that down is much further down the line however.
Hi Rog, I am a bit confused by your response; it doesn’t really answer my question. The precession issue, the z-axis mass above and below the solar equator changing over time, is what I am trying to resolve. I did some work this morning using JPL, Semi’s Ephemerides Viewer and the Sky View Cafe website, and if my research is correct it shows the precession of the planets moves the mass in the z axis over time.
It appears the actual orbital inclination of the planets does not move over time (6000 years). The perihelion point moves with the precession, along with the mass. I looked at Jupiter and it showed a movement of about 2 deg in the z axis per 6000 years, which is significant. The planets all have their own inclination angles, which can be in opposition to each other. If my figures are correct it means the z-axis mass totals would not follow a repeatable pattern, especially when looking at longer timeframes. Your graph showing the correlation of z axis and SSN may just be a coincidence, or it could also mean the solar cycle modulation does not follow a regular cycle; this is a fairly large difference between our two theories.
I would be interested to see if you come up with the same results.
Geoff Sharp says:
August 2, 2011 at 8:47 pm
This is the exact point I am trying to make, Leif. Your analysis shows the 172-year cycle as insignificant. But what does the actual data tell us? It tells us without any doubt there is a 172-year fluctuation (cycle) in distance (AM) that is far from insignificant.
It is just a minor perturbation and is indeed insignificant. That you attach significance to it is another matter.
If the theory is correct it means the 172-year wave is responsible for, or at least correlates with, solar cycle modulation; the same logic is being used by Scafetta and his 60-year cycle. I am saying undetected is basically the same as insignificant.
But most likely the theory is not correct, as it is physically impossible.
Willis says “I find no such 60-year cycle in the barycentric data.”
Scafetta and I have shown there is a clear 60-year cycle in the solar velocity record. We use different scientific methods to isolate that cycle.
There is a tiny cycle, not a dominant one as in the very short temperature record.
One method of analysis shows little or no significance of a cycle and another method shows a very clear cycle that is fundamental to a scientific theory.
There is no clear cycle, just a minor perturbation, and there is no scientific theory, just hand waving. Your perturbations are not even correlated with solar activity. The power spectrum of Steinhilber’s solar activity data http://www.leif.org/research/FFT-Steinhilber.png shows no prominent peak at 172 years, but lack of power at 170 and a peak at 177 years. The dominant cycle is the Hallstatt cycle at 2343 yrs and the Suess cycle at 208 yrs. The 86 yr peak is half of your 172 yrs. There is broad power around 120-130 yrs. It is clear that your 172 yr perturbation is not a major influence.
I object to the title of this thread based on these reasons.
so you disagree, but that is hardly grounds for rejection of his opinion.
Geoff Sharp says:
August 2, 2011 at 3:29 pm
Since you haven’t provided any citation for that claim, I assume you are referring to this:
So in short, you don’t have a clue what kind of equation would “capture the variability of the trident head phenomenon”. You don’t know how it is created. You don’t know how it escapes Fourier analysis as you claimed.
You have also provided a link to a chart of Ted Landscheidts … I discussed this stuff with Ted himself, Geoff. Unfortunately, he did what you are doing. You are noticing a phenomenon, that sometimes a combination of cycles ends up looking like a trident. You yourself admit you have no explanation for this supposed entity.
You say that the trident-head wave can go undetected because one or more of the three prongs of the trident may not show up. You don’t have a theory of how or when that might happen.
So no, I’m not “ignoring” the data as you claim. I find the data totally unexceptional. Sure, when you have varying waves you can easily get a beat frequency that looks like a trident. I could make you some if you like. Then you could play with them, and notice that the appearance of such a pattern is very common. You could also understand why under some circumstances such a beat frequency may not appear as a fundamental frequency in Fourier analysis … because it’s not a fundamental frequency.
However, such a beat frequency will appear in periodicity analysis … just sayin …
You keep coming back to credibility. You are making some kind of claim that a phenomenon called a “trident headed wave” exists as a separate category of wave. You are unable to provide any theoretical or mathematical understanding beyond saying well, there really are trident headed waves, and your evidence is that you can point to them.
I, on the other hand, have provided full accountability for the mathematical methods I used, including the code.
I’ll let the reader decide whose ideas are more credible here.
w.
nicola scafetta says:
July 30, 2011 at 11:06 am
To Willis Eschenbach,
I am sorry that I need to contradict Willis; his analysis is very poor.
Our analysis is based on the correct techniques, that is, “multiple” power spectrum analysis against a red-noise background. I would like to insist on the word “multiple” because I used three alternative methods. The quasi 20 and 60 year cycles are quite evident in the data. These tests are done in Scafetta 2010. In L&S 2011 we simply reference those results.
Moreover, similar cycles have been found by numerous other people in numerous climatic data and published in numerous papers. So, there is very little to question.
Moreover, the curves shown in figures 1, 2, 3, 4 show equal cycles which are not sinusoidal, but are clearly equal.
See for example the above 20-year modulation shown in figure 4. The cycles are not sinusoidal, but they are still perfectly “equal”.
It is very unlikely that the temperature [would] present such a perfect repetition of cycles, which would be possible only if the temperature were made of cycles with perfect periods of 20, 10, 5, 4, 2 [years].
What Willis did is simply to calculate a single average cycle and then he plotted this same cycle many times in a consecutive way.
Try to use a sequence made of two cycles with periods 20 and 15, then use your period-20 analysis, and you will see that your technique fails to properly reproduce the modulation of the curve.
I hope Willis is taking his time to consider Nicola Scafetta’s response, and is carefully formulating a well considered reply. Because if not he is just ignoring Scafetta’s criticism, having written a post which purports to rebut Loehle and Scafetta’s paper.
That Scafetta took the time to answer Willis should be regarded as a compliment, even though he is critical of the analysis technique employed to rebut the L & S 2011 paper.
Hack and run tactics will not enhance the reputation of open peer review. I think Willis owes it to the open science community as well as to Craig Loehle and Nicola Scafetta to respond to the criticism.
@John Day,
Yes, I did do a PhD in Electrical/Biomedical engineering in a signal processing laboratory. I am familiar with the Nyquist theorem that relates the sampling frequency necessary to define a signal and I am also familiar with Shannon’s theorem which relates signal entropy and spectral characteristics. (See Papoulis, Bendat & Piersol, Oppenheim & Schaeffer).
I agree that a signal, when filtered with an anti-aliasing filter and then sampled at, or above, the Nyquist frequency, can be reconstructed with tolerable accuracy, although to do so one, theoretically, needs an infinite length of samples obtained at infinitesimal resolution. (Why? Because construction of a continuous signal from samples is simply convolution with a sin(x)/x waveform that is not bounded in time and has an asymptotic amplitude).
However, you have missed a very fundamental point. Many variables that are treated as signals in climatology cannot be treated with an anti-aliasing filter and so may be aliased. You clearly haven’t thought about my example: the CRUTEMP data, which is presented as monthly samples, is severely aliased, simply because the daily samples are effectively filtered by averaging with a frequency response of the form sin(w)/w (w = complex frequency), and this filter has its first zero at 1/month. The signal is then decimated at 1/month, giving a Nyquist frequency of 1/(2 months). Therefore the signal is aliased and the daily data cannot be reconstructed from the monthly data.
I suggest that you take the CRUTEMP data, segment it into, say, 10 year records and compute the ensemble amplitude spectrum ( having applied a cosine bell window and detrended the record). You will see immediately that the data is aliased. You might also like to take a well-sampled signal, obtain averages samples, and decimate the signal at a period equal to the length of the averaging window.
Does this matter? It depends on what you want to do with the data. As you must be aware, the effect of aliasing in the frequency domain is to fold the spectrum, which is the true spectrum convolved with the sampling process, an infinite sequence of impulses spaced at the sampling frequency. Therefore the high-frequency components in the signal appear as spurious lower-frequency signals in the aliased spectrum, leading to spurious trends in the time-domain signal. If this is used as an input, say, to a model, it will produce spurious results.
I stand by my comment that ignoring the Nyquist theorem, or at least not being aware of its finer points, is depressingly common and, in my view, stems from a mindless application of DSP techniques without sufficient analysis.
As a follow-up to my last remark, the thesis presented in this post has been constructed using aliased data and therefore the results are simply wrong.
One advantage of spectral analysis is that it is relatively easy to determine if data is aliased and this is a good first step to apply to data before applying further signal processing techniques.
Willis Eschenbach says:
August 2, 2011 at 11:01 pm
So in short, you don’t have a clue what kind of equation would “capture the variability of the trident head phenomenon”. You don’t know how it is created. You don’t know how it escapes Fourier analysis as you claimed.
Incorrect. An equation can't capture the variability; it's like asking how powerful the swell of the ocean will be in 1001 days. The infinite variability of the planet positions governs the strength of the prongs and how many there are. If you can write an equation for that I will be impressed. I know exactly how it is created and have provided the data to back it up; you simply don't understand it. The trident example was meant to convey to you that I do understand why the process of Fourier analysis is not picking up a regular signal: because it is not regular. Leif points out the De Vries or Suess cycle; that is a product of the trident. One of the prongs is weak, so the gap increases. Look at the Dalton minimum and now: 210 years between grand minima. The last prong of the Dalton didn't fire, and the first prong (SC20) of the current cycle was the same. One day you guys will get it.
You have also provided a link to a chart of Ted Landsc***ts … I discussed this stuff with Ted himself, Geoff. Unfortunately, he did what you are doing. You are noticing a phenomenon, that sometimes a combination of cycles ends up looking like a trident. You yourself admit you have no explanation for this supposed entity.
Incorrect again: the charts I have linked to have nothing to do with Landsch**dt, and you have no idea of what you are discussing. The chart mentioned (which chart?) was probably related to the famous chart created by the now-deceased Carl Smith, to whom we are indebted. The new work that has sprung from his graph is, I think, ground-breaking; you have absolutely no knowledge of this topic. You need to bring yourself up to speed before you can criticize. My research has very little to do with the Landsche*** method.
You say that the trident-head wave can go undetected because one or more of the three prongs of the trident may not show up. You don’t have a theory of how or when that might happen.
Incorrect again. Read my paper, everything is answered if you bother to read.
So no, I’m not “ignoring” the data as you claim. I find the data totally unexceptional. Sure, when you have varying waves you can easily get a beat frequency that looks like a trident. I could make you some if you like. Then you could play with them, and notice that the appearance of such a pattern is very common. You could also understand why under some circumstances such a beat frequency may not appear as a fundamental frequency in Fourier analysis … because it’s not a fundamental frequency.
However, such a beat frequency will appear in periodicity analysis … just sayin …
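The beat-frequency point can be shown in a few lines. This is a synthetic sketch; the 11- and 12-unit periods are arbitrary choices for illustration, not solar cycles:

```python
import numpy as np

# Two sinusoids with nearby periods sum to a waveform whose envelope
# "beats" every 132 units, yet the spectrum contains only the two
# fundamentals: there is no energy at the beat frequency itself.
n = 1320                          # an exact number of both cycles
t = np.arange(n)
x = np.sin(2 * np.pi * t / 11) + np.sin(2 * np.pi * t / 12)

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n)
periods = 1.0 / freqs[spec > 0.1 * spec.max()]
print(periods)                    # [12. 11.] -- no 132-unit peak

beat_bin = n // 132               # where a 132-unit cycle would sit
print(spec[beat_bin])             # ~0: the beat is not a component
```

The envelope repeating every 132 units is plainly visible if you plot `x`, yet the Fourier spectrum assigns it no energy at all, because it is a product of the two fundamentals rather than a component in its own right.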
You haven't even looked at the data … this is the problem. Read the paper.
This is not good enough from a frequent author on a science blog, you are doing this blog a disservice and in my mind have zero credibility. I will not waste any more time on you.
———————
You keep coming back to credibility. You are making some kind of claim that a phenomenon called a “trident headed wave” exists as a separate category of wave. You are unable to provide any theoretical or mathematical understanding beyond saying, well, there really are trident headed waves, and your evidence is that you can point to them.
Incorrect again (this is getting boring). The planet angles create the perturbations that cause solar slowdown, which usually travel in threes. They are different every time; this is how nature works and it is normal orbital physics. The data used is from JPL and beyond question. You have absolutely no understanding of the process and refuse to look at it. I will leave people to judge your credibility.
Richard Saumarez and John Day,
Please would you take a look at the DSP techniques employed in this post and leave some comments.
http://tallbloke.wordpress.com/2011/07/31/bart-modeling-the-historical-sunspot-record-from-planetary-periods/
I’d also appreciate any input you can give to the discussion about sampling rates for barycentric data towards the bottom of the comments in this thread too, if you can spare the time:
http://tallbloke.wordpress.com/2011/07/25/ed-fix-solar-activity-simulation-model-revealed/
Many thanks
Bob Tisdale says:
August 2, 2011 at 7:02 pm
I would have avoided commenting on this thread, but I accept this part of your comment to be directed at me as well. Here’s a suggestion, Geoff. Maybe you should create your own website and blog where you can have the guest authors of your choice.
I have two blogs Bob, this goes to show how incredibly out of touch you are…click on my name.
Leif Svalgaard says:
August 2, 2011 at 10:07 pm
I object to the title of this thread based on these reasons.
———————————————-
So you disagree, but that is hardly grounds for rejection of his opinion.
The title of this “story” is offensive and very clearly suggests that the work of Loehle and Scafetta is pseudo-science. All based on very weak science by the author. I would not be proud to have this story on my blogs, but do understand Anthony is a busy man and respect the work that he does.
tallbloke says:
August 3, 2011 at 12:15 am
Oh, please, tallbloke, enough with the veiled insults. If I had seen Scafetta’s response, I would have answered it. I have a lot on my plate, it’s four AM and I’m leaving for Alaska. I’ll get back to Nicola as soon as I can.
But for you to put out all that ‘Is Willis hiding from Nicola’ kind of innuendo is just nasty, tallbloke. A simple “did you see this” without spilling the ugly contents of your mental suspicion machine would have been sufficient, I could have said ‘no I missed it’ and gone on.
That kind of thing is far beneath you, tallbloke. I’m surprised, usually you use your indoor voice and play well with the other kids.
Back in a bit,
w.
Geoff Sharp:
As I have repeatedly said, I think the Loehle & Scafetta method is flawed. But I do NOT consider it to be pseudo-science.
However, at August 3, 2011 at 3:39 am you say to Willis:
“The title of this “story” is offensive and very clearly suggests that the work of Loehle and Scafetta is pseudo-science.”
Hmmm. I had not thought the title made that suggestion, and I still don’t. Just saying.
I wonder if emotions are getting a little high in this discussion.
Richard
@richard Saumarez
I agree that a signal, when passed through an anti-aliasing filter and then sampled at, or above, the Nyquist frequency, can be reconstructed with tolerable accuracy,…
Apparently you do not know the sampling theorem.
But you’re making progress. You started with
‘A Fourier TRANSFORM cannot be made on a real signal.’
then
‘It can be approximated for a time windowed signal using a Discrete Fourier transform’
Now you allow reconstruction with ‘tolerable accuracy’.
When will you admit that the Shannon sampling theorem shows us, mathematically, that analog band-limited signals can be perfectly reconstructed from digitized samples of that signal? In the same sense, mathematically, that a continuous straight line may be reconstructed perfectly from only two samples.
Here are Claude Shannon’s own words [1] on this subject:
“Theorem 1: If a function f(t) contains no frequencies higher than W cps, it is completely determined by giving its ordinates at a series of points spaced 1/2W seconds apart.
This is a fact which is common knowledge in the communication art. The intuitive justification is that, if f(t) contains no frequencies higher than W, it cannot change to a substantially new value in a time less than one-half cycle of the highest frequency, that is, 1/2W. A mathematical proof showing that this is not only approximately, but exactly, true can be given as follows. [snip]”
Your homework tonight is to read and understand this proof.
😐
[1] Claude Shannon, “Communication in the Presence of Noise”, Proc. Institute of Radio Engineers vol. 37 (1): 10–21, 1949.( http://www.stanford.edu/class/ee104/shannonpaper.pdf )
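As an illustration of Theorem 1, here is my own sketch of sinc-kernel reconstruction. It truncates the infinite sum the theorem requires to 400 samples, so (as Richard noted above) the agreement between samples is close but not exact:

```python
import numpy as np

# A 3 Hz cosine is band-limited below W = 4 Hz. Sampling at the
# Nyquist rate fs = 2W = 8 Hz and interpolating with sinc kernels
# recovers the signal's values between the sample points.
W = 4.0
fs = 2 * W
n = np.arange(-200, 200)                    # finite stand-in for the
samples = np.cos(2 * np.pi * 3.0 * n / fs)  # theorem's infinite series

def reconstruct(t):
    # f(t) = sum_n f(n/fs) * sinc(fs*t - n); np.sinc(x) = sin(pi x)/(pi x)
    return float(np.sum(samples * np.sinc(fs * t - n)))

t0 = 0.1                                    # a point between samples
approx = reconstruct(t0)
exact = float(np.cos(2 * np.pi * 3.0 * t0))
print(approx, exact)                        # agree to a few decimals
```

With the full (infinite) sum the reconstruction is exact, which is the mathematical content of Shannon's proof; the residual error here comes entirely from truncating the sinc tails.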
Willis Eschenbach says:
August 3, 2011 at 4:32 am
But for you to put out all that ‘Is Willis hiding from Nicola’ kind of innuendo is just nasty, tallbloke. A simple “did you see this” without spilling the ugly contents of your mental suspicion machine would have been sufficient, I could have said ‘no I missed it’ and gone on.
The lady doth protest too much. I referred to Scafetta’s reply to you in an answer to you three days ago. Mind you, checking back, you never replied to that either. It might explain why you know squat about solar system dynamics, though; your arithmomania leads you to ignore responses which don’t contain numerical information couched in terms which can be handled by the technique you think is best.
Geoff Sharp says: “I have two blogs Bob, this goes to show how incredibly out of touch you are…click on my name.”
I’m aware of your website and blogs, Geoff. I visited your website twice–once a couple of years ago when I was researching the works of Theodor Landscheidt, and once within the past few months when I was trying to determine if you personally presented anything of value at your website.
With respect to my earlier comment, apparently I have to be more specific with you. The topic of discussion was authors of guest posts at WUWT. As you might recall, the basis of my earlier statement was your reply to Willis, “There seems to be a trend lately that WUWT is promoting Luke Warmer resident guest authors on a permanent basis.”
I’ll clarify my reply. Here’s a suggestion, Geoff. Maybe you should attempt to create and maintain a website and blog comparable to WUWT where you can have the guest authors of your choice and where those guest authors would want to have their posts presented.
It is interesting to see that you use the same debate tactics with Willis that you do with me (and continue to use with Leif). When you are presented with data-based reality, you offer conjecture in an attempt to redirect the discussion. When you are incapable of responding to a question or request for information, you misdirect and you attack the person asking the question or making the request. You’re predictable. Any reader can scroll through the recent threads where you’ve argued and see how you repeat the same tiresome and time-wasting tactics.
Willis has not lost any credibility on this thread, Geoff, but you are incapable of restoring yours.
Geoff Sharp says:
August 3, 2011 at 1:17 am
the process of Fourier analysis is not picking up a regular signal, that is because it is not regular. Leif points out the De Vries or Suess cycle, that is a product of the trident. One of the prongs is weak so the gap increases.
The Fourier analysis of the barycenter data does pick up the 172 yr wave and no 208 yr wave. The solar data does not have any 172 yr signal, but a strong 208 yr signal. And there is no correlation between the tridents and grand minima.