The Sea Level Cycles Get More Elusive

Guest Post by Willis Eschenbach

In my last post on the purported existence of the elusive ~60-year cycle in sea levels, as claimed in the recent paper “Is there a 60-year oscillation in global mean sea level?”, I used a tool called “periodicity analysis” (discussed here) to investigate cycles in the sea level. However, some people said I wasn’t using the right tool for the job. And since I didn’t find the elusive 60-year cycle, I figured they might be right about periodicity analysis. In the process, however, I found a more sensitive tool, which is to simply fit a sine wave to the tidal data at each cycle length and measure the peak-to-peak amplitude of the best-fit sine wave. I call this procedure “sinusoidal periodicity” for a simple reason: I’m a self-taught mathematician, so while I’m sure this analysis method is already known, I don’t know what it’s actually called.
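To make the procedure concrete, here is a minimal sketch in Python (illustrative only: this is not my actual R “sinepower” function, and the data below are synthetic). It uses the standard trick of fitting a·sin + b·cos by ordinary linear least squares, which is mathematically equivalent to optimizing amplitude and phase but needs no iterative optimizer:

```python
import numpy as np

def sine_amplitude(y, period):
    """Least-squares fit of a sinusoid of the given period (in samples)
    to the series y; returns the peak-to-peak amplitude of the fit."""
    t = np.arange(len(y))
    # Fitting a*sin + b*cos by linear least squares is mathematically
    # equivalent to optimizing amplitude and phase, with no iteration.
    X = np.column_stack([np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period),
                         np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return 2.0 * np.hypot(coef[0], coef[1])  # amplitude -> peak-to-peak

# Synthetic example: 200 years of monthly "sea level" with a 150-mm
# annual swing (75-mm amplitude) plus noise, linearly detrended first.
rng = np.random.default_rng(0)
months = np.arange(2400)
sea = 75.0 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 20, months.size)
sea = sea - np.polyval(np.polyfit(months, sea, 1), months)
print(sine_amplitude(sea, 12))  # close to 150 (mm, peak to peak)
```

Sweeping `period` over every trial length from 12 months up to some fraction of the record, and plotting the returned amplitudes, gives curves like those shown below.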

I like to start with a look at the rawest view of the data. In this case, here’s the long-term Stockholm tide gauge record itself, before any further analysis. This is the longest complete monthly tidal gauge record I know of, at 200 years.

Figure 1. Stockholm monthly average sea level, 1801-2000. This is relative sea level, measured against an arbitrary zero point.

As you can see, Stockholm is (geologically speaking) rapidly leaping upwards after the removal of the huge burden of ice and glaciers about 12,000 years ago. As a result, the relative sea level (ocean relative to the land) has been falling steadily for the last 200 years, at a surprisingly stable rate of about 4 mm per year.

In any case, here’s what the sinusoidal periodicity analysis looks like for the Stockholm tide data, both with and without the annual cycle:


Figure 1a. “Sinusoidal Periodicity” of the Stockholm tide gauge data, showing the peak-to-peak amplitude (in millimetres) of the best-fit sine wave at each period length. Upper panel shows the data including the annual variations. In all cases, the underlying dataset is linearly detrended before sinusoidal periodicity analysis. Note the different scales of the two panels.

Now, I could get fond of this kind of sinusoidal analysis. To begin with, it shares one advantage of periodicity analysis, which is that the result is linear in period, rather than linear in frequency as is the case with Fourier transforms and spectral analysis. This means that from monthly data you get results in monthly increments of cycle length. Next, it outperforms periodicity analysis in the removal of short-period signals. As you can see above, unlike with periodicity analysis, removing the annual signal leaves the longer-term cycles totally unchanged. Finally, I very much like the fact that the results are in the same units as the input data, in this case millimetres. I can intuitively get a sense of a 150-mm (6 inch) annual swing in the Stockholm sea level as shown above, or a 40-mm (1.5 inch) swing at both ~5.5 and ~31 years.

Let me start with a few comments on the Stockholm results above. The first is that there is no significant power in the ~11-year period of the sunspot cycle, or in the 22-year Hale solar cycle, as many people have claimed. There is a small peak at 21 years, but it is weak. After removal of the annual cycle, the next strongest cycles peak at ~5.5, 31.75, and 15 years.

Next, there are clearly cycle lengths which have very little power, such as 19.5, 26.5, and 35 years.

Finally, I don’t see much sign of the proverbial ~60-year cycle. In this record, at least, there isn’t much power in any of the longer cycles.

My tentative conclusion from the sinusoidal analysis of the Stockholm tide record is that we are looking at the resonant frequencies (and non-resonant frequencies) of the horizontal movement of the ocean within its surrounding basin.

So let me go through the other 22 long-term tidal datasets that I linked to in my last post, all of them 120 years long or longer, using this tool, to see what we find. Why 120 years? Because I’m forced to use shorter datasets than I’d like. Normally, I wouldn’t consider results significant unless the dataset is at least three times the length of the cycle in question, meaning 180 years of data for a 60-year cycle. However, there are very few datasets that long, so the next step down is to require at least 120 years of data, two full cycles, to look for a 60-year cycle. Less than that and you’re just fooling yourself. So without further ado, here are the strengths of the sinusoidal cycles for the first eight of the 22 datasets …

Figure 2. Sinusoidal amplitude, first eight of the 22 long-term (>120 year) datasets in the PSMSL database. Note that the units are different in different panels.

The first thing that strikes me about these results? The incredible variety. A few examples: Brest has lots of power in the longer-term cycles, with a clear peak at ~65 years. Wismar 2, on the other hand, has very little power in the long-term cycles, but a clear cycle at ~28 years. San Francisco has a 55-year peak, but its strongest peak is at 13 years. In New York, on the other hand, the ~51-year peak is the strongest cycle after the annual cycle. Cuxhaven 2 has a low spot between 55 and 65 years, as does Warnemunde 2, which goes to zero at about 56 years … go figure.

Confused yet? Here’s another eight …

Figure 3. Sinusoidal periodicity, second eight of the 22 long-term (>120 year) datasets in the PSMSL database. Note that the units are different in different panels.

Again the unifying theme is the lack of a unifying theme. Vlissingen and Ijmuiden bottom out around 50 years. Helsinki has almost no power in the longer cycles, but its shorter cycles are up to 60 mm in amplitude. Vlissingen is the reverse: the shorter cycles are down around 15-20 mm, and the longer cycles are up to 60 mm in amplitude. And so on … here’s the final group of six:

Figure 4. Sinusoidal periodicity, final six of the 22 long-term (>120 year) datasets in the PSMSL database. Note that the units are different in different panels.

Still loads of differences. As I noted in my previous post, the only one of the datasets that showed a clear peak at ~55 years was Poti, and I find the same here. Marseilles, on the other hand, has power in the longer term, but without a clear peak. And the other four all bottom out somewhere between 50 and 70 years; no joy there.

In short, although I do think this method of analysis gives a better view, I still cannot find the elusive 60-year cycle. Here’s an overview of all 22 of the datasets, you tell me what you see:

Figure 5. Sinusoidal periodicity, all twenty-two of the long-term tide gauge datasets.

Now, I got started on this quest because of this statement in the Abstract of the underlying study, viz:

We find that there is a significant oscillation with a period around 60-years in the majority of the tide gauges examined during the 20th Century …

(As an aside, waffle-words like “a period around 60-years” drive me spare. The period that they actually tested for was 55 years … so why not state that in the abstract? Whenever one of these good cycle-folk says “a period around”, I know they are investigating the upper end of the stress-strain curve of veracity … but I digress.)

So they claim a 55-year cycle in “the majority of the tide gauges” … sorry, I’m still not seeing it. The Poti record in violet in Figure 5 is about the only tide gauge to show a significant 55-year peak.

On average (black line), for these tide gauge records, the strongest cycle is 6 years 4 months. There is another peak at 18 years 1 month. All of them have low spots at 12-14 years and at 24 years … and other than that, they have very little in common. In particular, there seem to be no common cycles longer than about thirty years or so.

So once again, I have to throw this out as an opportunity for those of you who think the authors were right and who believe that there IS a 55-year cycle “in the majority of the tide gauges”. Here’s your chance to prove me wrong, that’s the game of science. Note again that I’m not saying there is no 55-year signal in the tide data. I’m saying I’ve looked for it in a couple of different ways now, and gotten the same negative result.

I threw out this same opportunity in my last post on the subject … to date, nobody has shown such a cycle exists in the tide data. Oh, there are the usual number of people who also can’t find the signal, but who insist on telling me how smart they are and how stupid I am for not finding it. Despite that, so far, nobody has demonstrated the 55-year signal exists in a majority of the tide gauges.

So please, folks. Yes, I’m a self-taught scientist. And yes, I’ve never taken a class in signal analysis. I’ve only taken two college science classes in my life, Introductory Physics 101 and Introductory Chemistry 101. I freely admit I have little formal education.

But if you can’t find the 55-year signal either, then please don’t bother telling me how smart you are or listing all the mistakes you think I’m making. If you’re so smart, find the signal first. Then you can explain to me where I went wrong.

What’s next for me? Calculating the 95% CIs for the sinusoidal periodicity, including autocorrelation. And finding a way to calculate it faster: optimization is slow, double optimization (phase and amplitude) is slower, and each analysis requires about a thousand such optimizations. The whole thing takes about 20 seconds on my machine, which is doable, but I’d like some faster method.

Best regards to each of you,

w.

As Always: Please quote the exact words that you disagree with, it avoids endless misunderstandings.

Also: Claims without substantiation get little traction here. Please provide links, citations, locations, observations and the like, it’s science after all. I’m tired of people popping up all breathless to tell us about something they read somewhere about what happened some unknown amount of time ago in some unspecified location … links and facts are your friend.

Data: All PSMSL stations in one large Excel file, All Tide Data.xlsx

Just the 22 longest stations as shown in Figs. 2-4 as a CSV text file, Tide Data 22 Longest.csv .

Stockholm data as an Excel worksheet, eckman_2003_stockholm.xls

Code: The function I wrote to do the analysis is called “sinepower”, available here. If that link doesn’t work for you, try here. The function doesn’t call any external functions or packages … but it’s slow. There’s a worked example at the end of the file, after the function definition, that imports the 22-station CSV file. Suggestions welcome.

144 Comments
May 2, 2014 1:15 pm

the relative sea level (ocean relative to the land) has been falling steadily for the last 200 years, at a surprisingly stable rate of about 4 mm per year

To me, it looks like it has slowed substantially in the last decades. If we subtract the northern land rise, maybe we have some accelerating sea level rise after all? On a 50 to 100 year time scale that is.
/Jan

george e. smith
May 2, 2014 1:52 pm

In a related issue involving Fourier Transforms: I spend a lot of time and effort designing (imaging) lenses. Most of the work is achieved with approximations (ray optics).
When the results are good enough (the designs, that is), then it is time to switch to real physics-based analysis, based on diffraction and real wave-based computations.
So to that end my very expensive canned software can give me MTF plots, based on FFT processes, or I can have it do the full nine yards Fraunhofer diffraction integrals. It just goes much more slowly.
So I normally will try both the FFT and the Lamborghini method, to see if they differ (much).
Some times, it seems that the FFT just makes up “stuff” that isn’t really there; and it depends a lot on how one sets up some variables in the process.
Do you guys always have full faith and credit, in your FFTs; and do you also have some no holds barred fallback mathematology, you can use to verify reality ??
g
[The mods wonder if your fastest (or most expensive) results are obtained from a Mercedes analysis, a Ferrari or the traditional Lamborghini method? Mod]

May 2, 2014 3:23 pm

The first one is that there is no significant power in the ~ 11-year period of the sunspot cycle
I also use an odd periodicity analyser (used 1801-2000 period for both data sets)
http://www.vukcevic.talktalk.net/SSL-SSN.htm
Perhaps the ocean is too slow to respond to the 11-year cycle, but it appears that it may not be for the 100+ year ‘cycle’.

Michael Gordon
May 2, 2014 3:26 pm

I suggest that the name of the technique is just a DFT, Discrete Fourier Transform.
I’ve never commented here but FFT/DFT and signals relate to my work in the Navy hunting submarines and in my commercial and amateur radio licenses so I see it from a different perspective and might add some little bit of light to this.
I see that Chris Clark has also identified it as a DFT.
The beauty of the DFT is that almost anyone can put it together with simple tools, and it works with discontinuous data and with record lengths that are not a power of 2. It is easily comprehended and easily replicated. There’s nothing magical about it, unlike the FFT.
Theory of Operation
The idea is simple. The stream to be tested contains many different signals of varying frequency or period (take your pick). You basically GUESS at the possible existence of a sine wave, so you multiply each data point by a corresponding known point on your testing sine wave. Where they match you’ll get a positive result. When the test wave is +1, and the input signal contains a hidden sine wave of the same phase and period, it will also be +1 and the result is 1. When both waves go negative, they multiply and you still get +1. Every other wave in the stream being tested will have some other value less than 1, sometimes going negative and thus canceling any random positive.
Add them all together for one test pass and graph it as a dot on Y above your test frequency (or period) on X.
Increment the frequency (or period) of the test signal and run the data again.
What you get is the graphs shown above. Note that the peaks invariably widen as the period lengthens, because the number of integrating cycles is reduced, and with fewer test cycles the random “pluses” are not always cancelled by random “minuses”.
(End of theory of operation).
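In code, one such test pass and sweep might look like the following (a hypothetical Python sketch for illustration only; the function name and test data are invented):

```python
import numpy as np

def correlate_period(data, period):
    """One test pass: multiply the data by a unit sine of the trial
    period and sum; a large total means an in-phase sinusoid is present."""
    t = np.arange(len(data))
    return np.sum(data * np.sin(2 * np.pi * t / period))

# A hidden 30-sample sine buried in noise; sweep trial periods 5..99.
rng = np.random.default_rng(1)
data = np.sin(2 * np.pi * np.arange(600) / 30) + rng.normal(0, 0.5, 600)
periods = np.arange(5, 100)
scores = [abs(correlate_period(data, p)) for p in periods]
best = periods[int(np.argmax(scores))]
print(best)  # the sweep peaks at (or next to) period 30
```

Note that this sketch tests only against a sine of one fixed phase, which leads directly to the WARNING below.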
WARNING: Phase relationship to the data stream is important. FFT gives both sine and cosine coefficients but DFT doesn’t unless you deliberately make it so. If your test just happens to be out of phase you’ll get a NULL instead of a PEAK at the expected period or frequency.
DFT is very good about detecting sinusoidal signals because that is what you test against.
BUT a phenomenon that happens at 60 year intervals could be simply the sum of a 20 year phenomenon and a 30 year phenomenon. DFT/FFT will show nothing at 60, but will show the 20 and the 30. It won’t show what happens when the 20 and the 30 align.
It can only show that there is, or is not, a *sinusoidal* phenomenon at some period. It could be almost anything else other than sinusoidal at 60. DFT/FFT will show such a thing as a family of harmonics. Harmonically related DFT/FFT output suggests non-linear phenomena whose period can be deduced from the harmonics.
An abrupt change in land subsidence or rise will produce a PHASE SHIFT in the signal but not change the PERIOD of the signal. However, while the shift is taking place, it is indistinguishable from a change of frequency, a fact used by amateur radio VHF radios to implement FM (frequency modulation) by means of phase modulation.
It will show up on DFT/FFT as a harmonic if any phase shifts have taken place, and the rate of the phase shift will produce a frequency (or period) spike in the output, or more precisely, a whole family of spikes depending on how abrupt is the shift.
In the frequency domain, harmonics are to the RIGHT of the fundamental, but in a period chart such as above, harmonics will be to the LEFT.
I see in the aggregated chart a peak at about 20 years and a peak at about 30 years. This is likely to produce a 60 year phenomenon that is simply the sum of these two coming into alignment but won’t show up on a periodic DFT.

george e. smith
May 2, 2014 4:37 pm

“””””…..[The mods wonder if your fastest (or most expensive) results are obtained from a Mercedes analysis, a Ferrari or the traditional Lamborghini method? Mod]…….””””””
Well the Muddling Mercedes Method; gets me there in style, and the Fast Ferrari Transport has much better sound effects; but for sheer ecstasy, the Lamborghini Lift-off Leaves a Lot of Lemmings in the Lurch.
Who’d ever expect to find all that wonderment, hidden in a mundane computer mouse ??
It’s actually a very serious problem, to design a 2.0 mm focal length “Macro” camera lens (1:1) that focuses in a definite plane, but is deliberately very fuzzy, so that it cannot out resolve (Nyquistly) the very low resolution CMOS camera sensor; or else the cursor may decide to move backwards.
I start with the very sharpest lens, I can design, and then I add a very clever (I think so) anti-aliasing filter surface, integral with the lens (bloody invisible too), which retains the original focal plane, but now gives a Gaussian Waist like beam spot, whose size I can control by design, so it’s bigger than the pixels of the sensor chip. My first AA filter designs, were actually Tchebychev filters of a kind; but they gave spurious responses in the stop band, under certain conditions; which was revealed by the Fast Ferrari Transport.
The best (current) ones are more of a Bessel response, and that gives very low spurs in the stop band. And that’s when I have to put the Nitro in the Lamborghini. Those are very tricky, since the profile is not defined by any closed form functions. But it is defined by an integrated set of related functions, each of which produces point to point and slope continuous half cycles of a cubic polynomial function; maybe up to a dozen different functions in the family.
That makes it both tricky and time-consuming to set up the model, and even longer (multiple hours) to run the Lamborghini around the course (for maybe a hundred million laps).

Michael Gordon
May 2, 2014 5:25 pm

UPDATE: Mr. Willis Eschenbach advises elsewhere that he is NOT using FFT/DFT but something specifically tuned to periodic analysis. Also, I suspect that while I accurately described a technique for signal analysis, it might not be the DFT. Still, the code he uses does indeed multiply a calculated sine wave of incremented period by the data points to amplify that particular period (wavelength) if present.
The implementation code includes this at the heart:
sum((sin((seq_along(tdata) - 1) * 2 * pi / whichline + x[2]) * x[1] - (tdata - mean(tdata, na.rm = TRUE)))^2, na.rm = TRUE)
Since the implementation uses the sin() function I suspect it probably is vulnerable to phase shift nulls and this could be discovered using cos() instead of sin() and see what happens.
As I read it: For each integer in the range 1 to the number of data elements provided, multiply by 2pi (radians) then divide the resulting circle into as many parts as which line you are processing and for each such part, multiply by a normalized value in the table.
On the first pass the divisor is small and so the sine wave is going to be short and rough. If the data is monthly, a good technique is to just start with “12” — achieving the stated goal of ignoring variations less than a year in length but NOT ignoring those data points. By the time you reach higher values, the sine wave is going to be very long period with hundreds or thousands of sample points.
It looks a lot like a Sine DFT to me but like I say, I might be mistaken that this is a DFT. At any rate, it is effective, understandable and hence persuasive.
http://www.moyhu.org.s3.amazonaws.com/misc/fourier/tides/tides.html Moyhu’s implementation uses fft directly.

Shawnhet
May 2, 2014 6:48 pm

Willis Eschenbach says:
May 1, 2014 at 11:52 pm
“You see, this is why I asked you to quote what you disagree with. Obviously, you think someone here is “hanging their hat on sea levels as the only useful metric” for detecting climate cycles.
However, I know of absolutely no one who is making that claim, and I know I’m not, so I don’t know who you think you are disagreeing with … but it ain’t me ”
My point is that even if you are correct in every particular above, it doesn’t really tell us anything. If you assume that these cycles are well represented in a variety of other proxies but not in sea levels, then there are two possibilities: either our sea level records are somehow flawed or the climate cycles are not well represented in the oceans for some reason. Also, it is worth noting that yours is not the only approach to detecting these cycles with sea level and some of them get different answers than yours.
http://judithcurry.com/2013/09/13/contribution-of-pdo-to-global-mean-sea-level-trends/
Cheers, 🙂

Chris Clark
May 3, 2014 10:11 am

Can I just stress that it is important to use sine AND cosine functions, not just one. Reason: if there is a pure sine wave in the data, and you multiply it by another wave of the same frequency and phase, a positive in one is always multiplied by a positive in another, likewise a negative and a negative. The result is a positive coefficient. If it is multiplied by a wave 180 degrees out of phase, a positive is always multiplied by a negative and v.v., and the sum is a negative coefficient. If it is multiplied by a wave of the same frequency but in quadrature (90 degree phase difference), the result is sometimes +ve and sometimes -ve and the sum is zero. The signal is missed. Using a cosine as well catches a signal whatever the phase. Apologies for spelling out what may already be obvious.
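[A quick numerical illustration of Chris Clark’s point, in hypothetical Python, purely to show the effect; the data are invented:]

```python
import numpy as np

t = np.arange(1200)
period = 60
signal = np.cos(2 * np.pi * t / period)  # in quadrature with a sine test

sine_only = np.sum(signal * np.sin(2 * np.pi * t / period))
cos_part = np.sum(signal * np.cos(2 * np.pi * t / period))
both = np.hypot(sine_only, cos_part)  # phase-independent magnitude

print(abs(sine_only) < 1e-6)  # True: the sine-only test misses the signal
print(int(round(both)))       # 600: sine + cosine catches it at any phase
```

Taking the root-sum-of-squares of the sine and cosine coefficients recovers the signal regardless of its phase, exactly as described above.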

1sky1
May 3, 2014 11:40 am

There is no need to re-invent the wheel. Mathematicians have been fitting sinusoids to periodic data for more than two centuries. Nowadays it’s usually called “sinusoidal regression.” The “Willisgram” merely presents the results of such curve fitting at increments of period dictated by the data spacing. This can’t even identify precisely the period of the best-fitting sinusoid in the record.
There is a crying need, however, for comprehension of what discrete sinusoidal decomposition can and cannot accomplish in the general case, where periodic components may have incommensurable periods and random components have no strict period at all. That is the challenge facing the analysis of geophysical data. The spectral decomposition provided by the DFT smears all off-frequency periodic components into adjacent analysis frequencies, whose increments are dictated by data spacing and by record length. Meanwhile the random components are treated as if they repeat periodically at the given record length; i.e., as highly complex periodic wave-forms instead of random.
When the record is long enough to encompass scores of wave-forms of interest and the noise level is low enough, these are not insurmountable challenges. But multidecadal variations in climate records only a century long are a different matter altogether, especially when there is no physical basis for any expectation that they are periodic. Nor can zero-padding the record of a continuing geophysical signal add any low-frequency information. Unlike the periodic tides, they require different methods, akin to those used in the analysis of random ocean waves. Time allowing, I’ll report some results for sea-level records next week.

Michael Gordon
May 3, 2014 8:49 pm

1sky1 says: “There is no need to re-invent the wheel. Mathematicians have been fitting sinusoids to periodic data for more than two centuries.”
Alas, I am not one of those two-century old mathematicians. I *do* have to re-invent the wheel and sometimes get considerable joy doing so. If I succeed, I learn the underlying technique of how wheels are made, or in this case, detecting the presence of a sinusoidal signal in what otherwise looks like noise.
Naturally I am a little embarrassed when my big discovery turns out to have been discovered already countless times over the past 200 years. But to me it is a validation.
I believe value exists in the fact of independent discovery. If everyone on earth used exactly the same computer program to analyze something it is possible that the program itself is introducing artifacts. As many people as wish should be encouraged to “re-invent the wheel” and discover for themselves whatever there is to discover. Others might inspect the wheel — in this case, a computer program.

Michael Gordon
May 3, 2014 9:52 pm

1sky1 wrote: “results of such curve fitting at increments of period dictated by the data spacing. This can’t even identify precisely the period of the best-fitting sinusoid in the record.”
Agreed, but that appears not to be the purpose of this analysis, which was merely to test for a 60-year period. I think we agree that an “about” 60-year in-phase sinusoidal phenomenon has not been detected.
But there’s a more sinister problem: Aliasing
Consider sampling a 28 day sine wave at 30 day intervals. At each sample, you are 2 days later into the cycle of the wave. Eventually your samples describe a perfect sine wave with a 14 month period — even though no such wave actually exists.
A typical source: http://redwood.berkeley.edu/bruno/npb261/aliasing.pdf
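[The 28-day / 30-day example can be checked numerically; a hypothetical Python sketch:]

```python
import numpy as np

# A 28-day sine sampled once every 30 days: each sample lands 2 days
# further into the cycle, so the samples trace out a slow false wave.
sample_days = np.arange(0, 8400, 30)  # 280 samples, ~23 years
samples = np.sin(2 * np.pi * sample_days / 28)

# The alias period is 1 / |1/28 - 1/30| days.
alias_period = 1.0 / abs(1.0 / 28 - 1.0 / 30)
print(round(alias_period))  # 420 days, i.e. about fourteen 30-day months

# The sampled values are identical to a 420-day sine at those times.
print(bool(np.allclose(samples, np.sin(2 * np.pi * sample_days / 420))))  # True
```

No analysis method, however clever, can distinguish the sampled 28-day wave from a genuine 420-day wave; the remedy is sampling faster than twice the highest frequency present.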

E.M.Smith
Editor
May 4, 2014 4:38 pm

18 years 1 month is very near one of the lunar / tidal cycles. 3 x that is 54 years 3 months. I’d speculate that folks finding a 55 year (ish) cycle are finding a harmonic of lunar / tidal activity.
https://chiefio.wordpress.com/2013/01/04/lunar-cycles-more-than-one/
https://chiefio.wordpress.com/2013/01/24/why-weather-has-a-60-year-lunar-beat/
I’d also suspect that the tide gauge data is not well suited to finding sea level cycles. Tides are highly variable dependent on local geography and variations in lunar influence (latitude and more). Frankly, I’m not sure what data would work well enough other than a global satellite daily survey for the whole planet, but we don’t have that for 200 years.
https://chiefio.wordpress.com/2013/01/29/is-there-a-sea-level/
So I’m not surprised that any one place doesn’t show a 55 – 60 year cycle. It might be interesting if the global average did, but even then I’d not think it “means much”. Probably more to do with some artifact of where the measurements were taken than actual changes in sea level / ocean depth.

May 4, 2014 7:52 pm

Aah, the joys of being a pattern-recognising animal! In the Good Old Days (good because they’re gone) we used to run away from patterns that frightened us: apply cobblers, fierce sabre-rattling tiggers, dread pylons and such. This meant we lived to pass on our pattern-recognising tendencies to our offspring. I call this survival of the frittest. [Frit was the past tense of frighten in the dialect of my youth.] These days we pay a carbon tax so that we don’t all suffer mass thermageddon. But many people seem to still be frit! So it goes…

1sky1
May 5, 2014 5:28 pm

Michael Gordon:
My comment was addressed to no one in particular; it was simply an attempt to correct a common misconception: when geophysicists speak of a “quasi-periodic oscillation of ~x yrs”, they do NOT imply that it is periodic. Other than those processes determined by periodic astronomical forces or the rotation of the Earth, we are dealing with RANDOM processes characterized by CONTINUOUS power densities, rather than by the LINE spectra that FFT-jockeys often presume. And the problem of aliasing is well-known and adequately controlled by competent analysts; it, along with the “Slutsky effect” oft-mentioned by novices, is a red herring.
Tomorrow, I’ll report what multi-decadal oscillations I found in some sea level data.

Michael Gordon
Reply to  1sky1
May 5, 2014 9:22 pm

1sky1: Thank you for your reply of May 5, 2014 at 5:28 pm.
Over the weekend I have taken some time to study WAVELETS as recommended by at least a couple of readers (or the same reader several times) and readily see that wavelets produce information more useful in this context — phenomena that seem periodic but come and go (more or less chaotic in other words).
While I’ve heard of wavelets over the years, the link provided by a reader explained it quite simply, which I will summarize here. Consider a moving-window DFT/FFT that can detect periodic signals that might vanish with time. The problem with a moving-window FFT is that the abrupt transition at the window edges generates harmonics and false signals (ringing). So, shape the window with a Gaussian envelope. That eliminates the ringing. The next problem is varying resolution: you have only two cycles of the longest period but dozens of the shortest. So the wavelet is chosen to have the same number of cycles regardless of its test wavelength.
The result is going to be a 3D plot: wavelength, timeline, and intensity or energy at that wavelength and timeline.
It’s a little harder to implement than the DFT, but not by a lot. Its utility is that it can readily reveal periodicity AND it can reveal the comings and goings of periodicity: signs of chaotic behavior.
The only time I’ve ever seen real line spectra in FFT is hunting for ships and submarines, sometimes with harmonics if a bearing is going out 🙂
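[The Gaussian-windowed test described above can be sketched in a few lines of Python; a crude illustration of the idea with invented names and synthetic data, not production wavelet code:]

```python
import numpy as np

def windowed_power(y, period, center, n_cycles=6):
    """Gaussian-windowed sine/cosine correlation at one trial period and
    one time location: the core of a crude Morlet-style wavelet test.
    The window width scales with the period, so every scale is tested
    with the same number of cycles."""
    t = np.arange(len(y))
    width = n_cycles * period / 2.0
    window = np.exp(-0.5 * ((t - center) / width) ** 2)
    s = np.sum(y * window * np.sin(2 * np.pi * t / period))
    c = np.sum(y * window * np.cos(2 * np.pi * t / period))
    return np.hypot(s, c)  # phase-independent magnitude

# A 40-sample oscillation present only in the first half of the record.
t = np.arange(2000)
y = np.where(t < 1000, np.sin(2 * np.pi * t / 40), 0.0)

early = windowed_power(y, 40, center=500)
late = windowed_power(y, 40, center=1500)
print(early > 10 * late)  # True: the test sees the cycle come and go
```

Sweeping `period` and `center` over a grid gives the 3D plot described above: wavelength, timeline, and energy.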

1sky1
May 6, 2014 6:39 pm

Michael Gordon:
Wavelet analysis is very useful when dealing with data that, for one reason or another, are statistically non-stationary. That is the case with internal-wave packets that may appear only sporadically, or with slow, wholesale changes of signal characteristics observed in changing sea-states. But those are not manifestations of periodicity as such. Indeed, aside from tides, signals characterized by line spectra are quite rare in geophysics; they are usually associated with diurnal or annual cycles. In the context of sea-level data, the less said about ship-hunting applications, the better.
Because the .csv format of PSMSL data presents obstacles to loading it into my analysis programs, I’ve managed to take only a cursory look at the monthly San Francisco record and the annual data for Cascais. As expected, the acf of the former shows the persistent periodicity of the annual cycle. Upon detrending, the Cascais data immediately reveals a strong, albeit irregular, multidecadal oscillation to the naked eye. The maximum departure from the trend occurs in 1895, followed by a second strong peak in 1963. Resorting provisionally to simple interpolation, instead of predictive filters, to fill the gaps produces an acf with a strong negative minimum at a lag of 28 yrs, followed by an upward zero-crossing at the 44 yr lag. This is entirely consistent with an irregular, narrow-band “quasi-periodic oscillation” in the ~60 yr range.
More on this tomorrow.

1sky1
May 7, 2014 5:56 pm

Michael Gordon:
Below is the estimated power density spectrum of the detrended and interpolated Cascais data series, normalized to express the fractional contribution of each frequency band to the total sample variance. The frequency index k denotes the number of cycles per 66 years, and the spectral estimates vary as chi-squared with only ~6 degrees of freedom. The miserably wide confidence interval is a consequence of attempting to resolve multidecadal components in a record of only 112 yrs. Obviously, greater resolution could be obtained via the raw periodogram with 2 dof, but the confidence interval then would be horrendous. For brevity, only the first half of the frequency baseband is presented here; it accounts for 90% of total variance and the density at higher frequencies is down to the noise level.
 k   fraction of variance
 0   0.156349
 1   0.26658
 2   0.121605
 3   0.063852
 4   0.093805
 5   0.064397
 6   0.036742
 7   0.021372
 8   0.021294
 9   0.019332
10   0.007961
11   0.006061
12   0.00624
13   0.003018
14   0.003083
15   0.005073
16   0.00362
If you have any questions or comments, please let me know here. Otherwise, I’ll take up the subject on a different thread.

1sky1
May 7, 2014 6:00 pm

Oops. WordPress eliminated the tab between the index in the first column and values in the second.
