Guest Post by Willis Eschenbach
A look at Gleissberg’s famous solar cycle reveals that it is constructed from some dubious signal analysis methods. This purported 80-year “Gleissberg cycle” in the sunspot numbers has excited much interest since Gleissberg’s original work. However, the claimed length of the cycle has varied widely.
Back in the 1940s, a man named Wolfgang Gleissberg was studying sunspot cycles. To do so, in his own words, he introduced a new method, viz:
When I introduced the method of secular smoothing into the study of the variations of sunspot frequency (GLEISSBERG, 1944) I published a table containing the secularly smoothed epochs and ordinates of sunspot minima and maxima which I had deduced from the data published by BRUNNER in 1939. Since then, secular smoothing has proved to be one of the principal methods for investigating the properties of the 80-year cycle of solar activity (cf. RUBASHEV, 1964).
Figure 1. SIDC sunspot data, along with the “best-fit” sine wave for each cycle length from 40 years (orange, in back) to 120 years (blue, in front). Heavy black and heavy red horizontal sine waves show respectively the strength of the 80-year “Gleissberg Cycle” and the 102-year maximum-amplitude cycle.
This purported 80-year “Gleissberg cycle” in the sunspot numbers has excited much interest since Gleissberg’s original work. However, the claimed length of the cycle has varied widely. One source says:
In different studies the length of the period of the secular variation was determined to be equal to 95 years, 65 years, 55 years, 58 years, 83 years, 78.8 years, 87 years [Siscoe, 1980; Feynman and Fougere, 1984]. That situation is understandable, because the longest record of direct observations of solar activity was and still is the sunspot numbers which provides more or less reliable information since 1700 (see below). That gives one only 300 years of time span by now which encompasses ~3.4 periods of Gleissberg cycle which is quite low for its statistical analysis.
So what was Gleissberg’s “secular smoothing” method that he “introduced” in 1944? Well, it turns out to be a simple 1-2-2-2-1 trapezoidal filter … but one which he employed in a most idiosyncratic and incorrect manner.
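For concreteness, here is a minimal sketch of a 1-2-2-2-1 filter in Python. This is my own reconstruction from the description above, not Gleissberg’s actual procedure (and as we’ll see, he applied it in a way this simple version does not capture):

```python
import numpy as np

def secular_smooth(values):
    """Apply a 1-2-2-2-1 'secular smoothing' filter.

    Each smoothed point is a weighted average of five consecutive
    points with weights 1, 2, 2, 2, 1 (total 8).  Two points are
    lost at each end of the series.
    """
    w = np.array([1, 2, 2, 2, 1]) / 8.0
    v = np.asarray(values, dtype=float)
    return np.convolve(v, w, mode="valid")

# A constant series is left unchanged (apart from the lost end points):
print(secular_smooth([5, 5, 5, 5, 5, 5, 5]))  # -> [5. 5. 5.]
```

Because the weights are symmetric, the filter also passes a straight-line trend through unchanged; the trouble, as we’ll see below, comes with everything else.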
Let’s start, though, by looking at Figure 1. It shows the three centuries of sunspot data in black, along with best-fit sine waves in color for each cycle length, in one-year steps, from forty years (colored orange, in the back) to one hundred twenty years (colored blue, in the front). Of particular interest are the 80-year cycle proposed by Gleissberg (heavy wavy horizontal black line) and the largest long-term cycle, which is 102 years in length (heavy wavy horizontal red line). As you can see, the 80-year “Gleissberg cycle” is not distinguished in any way.
So … does this mean that in fact there is a 102-year cycle in the sunspot data? Well, no. We still only have data enough for three 102-year cycles. And in natural data, that’s not very reliable. The problem is that nature appears to be chaotic on all timescales, so I’m not trusting the 102-year cycle to stick around. But in any case … just how did Gleissberg get to his 80-year number? Therein lies a tale …
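For those who want to try something like Figure 1 themselves, the “best-fit sine wave” for a given cycle length can be found by ordinary least squares. The sketch below is an illustrative reconstruction, not the code behind the figures, and the synthetic series in the check is made up:

```python
import numpy as np

def best_fit_sine_amplitude(t, y, period):
    """Least-squares amplitude of a sine of the given period fitted to y(t).

    Fits a*sin(wt) + b*cos(wt) + c by ordinary least squares and reports
    sqrt(a^2 + b^2) as the strength of that cycle length.
    """
    w = 2 * np.pi / period
    X = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.hypot(coef[0], coef[1]))

# Synthetic check: a pure 102-year sine of amplitude 30 is recovered exactly.
t = np.arange(1700.0, 2013.0)
y = 30 * np.sin(2 * np.pi * t / 102)
print(round(best_fit_sine_amplitude(t, y, 102), 1))  # -> 30.0
```

Sweeping `period` from 40 to 120 years and plotting the amplitudes is one way to reproduce the kind of comparison shown in the figure.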
First, Gleissberg decided that what we’re looking at in Figure 1 is an amplitude modulated signal. So he figured he only had to deal with the envelope of the signal, which looks like this:
Figure 2. Envelope of the sunspot record shown in color. As an aside, it turns out to be a curiously tricky algorithm that is needed to identify true maxima, or true minima.
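By way of illustration, here is one way a true-maxima/true-minima finder might be sketched; flat-topped runs of equal values are a big part of what makes it tricky. This is a hypothetical sketch, not the algorithm actually used for the figures:

```python
import numpy as np

def envelope_points(y):
    """Indices of local maxima and minima of a series.

    A point is a local maximum if it is above both neighbours; a flat
    run of equal values is treated as a single candidate, represented
    by its first point.  Illustrative only.
    """
    y = np.asarray(y, dtype=float)
    maxima, minima = [], []
    i = 1
    while i < len(y) - 1:
        j = i
        while j < len(y) - 1 and y[j + 1] == y[j]:  # skip over a flat run
            j += 1
        if y[i - 1] < y[i] and (j < len(y) - 1 and y[j + 1] < y[j]):
            maxima.append(i)
        if y[i - 1] > y[i] and (j < len(y) - 1 and y[j + 1] > y[j]):
            minima.append(i)
        i = j + 1
    return maxima, minima

mx, mn = envelope_points([0, 2, 1, 3, 3, 0, 1])
print(mx, mn)  # -> [1, 3] [2, 5]
```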
Having gotten that far, he threw away everything but the envelope, leaving only the following information:
Figure 3. Envelope only of the sunspot record, maximum envelope shown in red, minimum envelope shown in blue.
And that poor misbegotten stepchild of a once-proud record was what he analyzed to get his 80-year cycle … sorry, just kidding. That would be far too simple. You see, the problem is that when you look at that envelope data in Figure 3, there are no evident long-term cycles in there at all. It’s just not happening.
To get around the minor issue that the data has no obvious cycles, Gleissberg applied his whiz-bang “secular smoothing” algorithm to the maximum and minimum envelope data, which gave the following result. Remember, there are no obvious cycles in the actual envelope data itself …
Figure 4. Result of “secular smoothing” of the maxima and minima envelopes of the sunspot data. Dotted vertical line marks 1944, the year that Gleissberg introduced “secular smoothing” to the world.
And voilà! Problem solved.
The big difficulty, of course, is that smoothing data often creates entirely specious cycles out of thin air. Look at what happens with the maximum envelope at 1860. In the original maximum data (light red), this is a low point, with peaks on either side … but after the filter is applied (dark red), it has magically turned into a high point. Smoothing data very commonly results in totally factitious cycles which simply do not exist in the underlying data.
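The inversion is easy to demonstrate with made-up numbers. Run the 1-2-2-2-1 filter over a dip flanked by two peaks, and the dip comes out as the highest point:

```python
import numpy as np

# The 1-2-2-2-1 filter can invert a feature: a local LOW flanked by
# two peaks can emerge from the smoother as the local HIGH.
w = np.array([1, 2, 2, 2, 1]) / 8.0
y = np.array([0, 0, 10, 4, 10, 0, 0], dtype=float)  # a dip between two peaks
smoothed = np.convolve(y, w, mode="valid")
print(smoothed)  # the middle value (the dip) is now the largest
```

The peaks average in their low neighbours, while the dip averages in both of its high neighbours, so their relative heights flip — just as at 1860 in Figure 4.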
There are a couple of other problems. First, after such a procedure, we’re left with only 24 maximum and 24 minimum datapoints. In addition, they are strongly autocorrelated. As a result, whatever conclusions might be drawn from Gleissberg’s reduced dataset will be statistically meaningless.
Next, applying a trapezoidal filter to irregularly spaced data as though they were spaced regularly in time is a big no-no. A filter of that type is designed to be used only on regularly spaced data. It took me a while to wrap my head around just what his procedure does. It over-weights long sunspot cycles, and under-weights short cycles. As a result, you’re getting frequency information leaking in and mixing with your amplitude information … ugly.
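A toy example shows the problem. Suppose (hypothetical numbers) the early sunspot maxima came about eight years apart and the later ones about thirteen years apart. A five-point filter applied by index then averages over very different stretches of time:

```python
import numpy as np

# Hypothetical epochs of successive sunspot maxima: short ~8-year cycles
# early, long ~13-year cycles late.  A five-point filter applied by INDEX
# spans very different amounts of TIME depending on the local cycle length.
times = np.array([1700, 1708, 1716, 1724, 1732,   # short cycles
                  1745, 1758, 1771, 1784, 1797])  # long cycles
span_early = times[4] - times[0]   # years covered by the first 5 points
span_late = times[9] - times[5]    # years covered by the last 5 points
print(span_early, span_late)  # -> 32 52
```

So the same filter is effectively wider in time over long cycles and narrower over short ones — which is exactly how frequency information leaks into the amplitude result.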
Finally, if you read his description, you’ll find that he has not only applied secular smoothing to the amplitudes of the maxima and minima envelopes. Most curiously, he has also applied his wondrous secular smoothing to the times of the maxima and minima (not shown). Is this an attempt to compensate for the problem of using a trapezoidal 1-2-2-2-1 filter on irregularly spaced data? Unknown. In any case, the differences are small; a year or so one way or the other makes little overall difference. However, it likely improves the (bogus) statistics of the results, because it puts the data at much more regular intervals.
CONCLUSIONS:
First, the method of Gleissberg is unworkable for a variety of reasons. It results in far too few datapoints which are highly autocorrelated. It manufactures cycles out of thin air. It mixes frequency information with amplitude information. It adjusts the time of the observations. No conclusions of any kind can be drawn from his work.
Next, is the 80-year cycle described by Gleissberg anywhere evident in the actual sunspot data? Not anywhere I can find. There is a very wide band of power in the century-long range in the sunspot data, as shown in Figure 1. However, I don’t trust it all that much, because it changes over time. For example, you’d think that things would kind of settle down over two centuries. So here’s the first two centuries of the sunspot data …
Figure 5. As in Figure 1, but only the earlier two centuries of the sunspot data.
Note that in the early data shown in Figure 5, there is very little difference in amplitude between the 80-year Gleissberg cycle, and the 95-year maximum amplitude cycle. You can see how Gleissberg could have been misled by the early data.
Now, let’s look at the latter two centuries of the record. Remember that this pair of two-century datasets have the middle century of the data in common …
Figure 6. As in Figure 5, but for the latter two centuries of the data.
In this two-century segment, suddenly the maximum is up to 113 years, and it is 2.5 times the size of the 80-year Gleissberg cycle.
In none of these views, however, has the 80-year Gleissberg cycle been dominant, or even noteworthy.
Please note that I am NOT saying that there are no century-long cycles, either in the sunspot data or elsewhere. I am making a careful statement, which is that to date there appears to be power in the sunspot data in the 95-120 year range. We can also say that to date, the power in the 80-year cycle is much smaller than anything in the 95-120 year range, so an 80-year “Gleissberg cycle” is highly unlikely. But we simply don’t have the data to know if that power in the century-long range is going to last, or if it is ephemeral.
Note also that I am saying nothing about either 80-year Gleissberg cycles, or any other cycles, in any climate data. This is just the tip of the Gleissberg. So please, let me ask you to keep to the question at hand—the existence (or not) of a significant 80-year “Gleissberg cycle” in the sunspot data as Gleissberg claimed.
Finally, if you are talking about, e.g., an 85-year cycle, that’s not a “pseudo-80-year cycle”. It’s an 85-year cycle. Please strive for specificity.
My best wishes to all,
w.
Claimer (the opposite of “disclaimer”?): If you disagree with anything I’ve written, which did actually happen once a couple years ago, please quote the exact words that you disagree with. Often heated disagreements stem from nothing more than simple misunderstandings.
Data: The adjusted SIDC data is available as SIDC Adjusted Sunspots 1700 2012.csv. In accordance with the advice of Leif Svalgaard, all values before 1947 have been increased by 20% to account for the change in sunspot counting methods. It makes little difference to this analysis.
We need the solar physicists to explain these cycles, looking first of all at straightforward resonance and calculating the resonant frequencies to be expected in our star.
The effect of this variation on the Earth is another matter. I have a conjecture that Earth’s atmosphere responds to solar variation after a timelag (due to ocean circulation) of 99 years: http://endisnighnot.blogspot.co.uk/2012/03/lets-get-sorted.html
You took a beloved paper and tore it to shreds with pure logic.
There should be a website dedicated to doing this daily – if more scientists did this, the world would be a lot better off…
Here are some Talking Heads lyrics to the song, “Paper.”
There are parallels here, I think…
PAPER
Hold the paper up to the light
(some rays pass right through)
Expose yourself out there for a minute
(some rays pass right through)
Take a little rest when the rays pass through
Take a little time off when the rays pass through
Go ahead and mess it up…Go ahead and tie it up
In a long distance telephone call
Hold on to that paper
Hold on to that paper
Hold on because it’s been taken care of
Hold on to that paper
See if you can fit it on the paper
See if you can get it on the paper
See if you can fit it on the paper
See if you can get it on the paper
Had a love affair but it was only paper
(some rays they pass right through)
Had a lot of fun, could have been a lot better
(some rays they pass right through)
Take a little consideration, take every combination
Take a few weeks off, make it tighter, tighter
But it was never, it was never written down
Still might be a chance that it might work out (if you)
Hold on to that paper
Hold on to that paper
Hold on because it’ll be taken care of
Hold on to that paper
Don’t think I can fit it on the paper
Don’t think I can get it on the paper
Go ahead and rip up, rip up the paper
Go ahead and tear up, tear up the paper
Thank you again Willis for your good work and your fresh look at historical data.
Willis, we disagree on geology. That debate is for another place. But we totally agree on scientific methodology, plus on your equatorial thermostatic regulator. Congrats again here on another climate debunking.
You had me at “envelope”.
Willis, this should be published in a journal if somebody hasn’t already sliced and diced this thing.
Apparently the practice of torturing innocent data to get a publishable confession has a long history and is not restricted to the world of climate science.
Your efforts to help clear the record of such artifacts deserve praise and support.
Willis, I look forward to your posts. Are you going to put these past pieces into book form? If so I will buy the first copy. Keep up the good work.
It seems that the sunspot range (amplitude) over time is what matters (say Fig 2); dealing with max and min activity makes curve fitting difficult. Would making the minima activity a straight line help with determining maxima shape and periodicity? To me it simplifies the range determination.
I am not a statistician so let me know if this is silly.
It is a curious characteristic of mankind that many of us have a level of cycle mania: we try to explain complex natural and physical phenomena that we don’t understand very well by using simple-to-understand but usually wrong derivations of supposed cycles of varying duration, amplitude and phase.
Cycles of course, are intrinsic to so much of the natural world which we see and experience and have done so through the long ages of the existence of the human species.
Day and night, seasons, moon phases and their variations, sunspots, and even human and animal biology (particularly of the female of the species): all cycles, all reasonably predictable.
So we continue to extend the theory of cyclic behavior into other fields in the usually vain hope that it can be used to explain yet another natural phenomenon.
And we try to find those cycles and do find them where none exist but which eventually are seen to be artifacts created by the manner in which the research was done.
And so we advance three steps forward, two steps back, and fortunately for our species only just occasionally four steps back, but always, eventually, forward again.
Nice, thanks.
I think you probably mean excited, but perhaps you do mean exited.
[Done. Thank you. Mod]
In a chaotic system you cannot expect real cycles. There are nevertheless peaks and valleys in the sunspot data which must be taken into account.
I, too, look forward to your posts, which always contain a wonderful mix of insight and commonsense, scientific method, good old-fashioned innocent curiosity all topped off with good humour.
As Anthony said, this should be a paper, but published here is good enough for me as the gatekeepers tend to leave posts on WUWT alone.
Willis –
Yes – the processing can make its own stuff!
There is of course the famous saying “Garbage-In, Garbage-Out” (GIGO) which I think applies here, in essence because the original data was just inherently inadequate to “provision” the parameter extraction claimed. The data supports just so much and no more.
Possibly in the “old days” with far less computing time, a lot more thought went into (or should have gone into) choosing a method of analysis, and less into crunching data to see if anything apparently interesting floated to the top. More thought might seem better – but it coincided with less testing of methods. Today there is a reverse tendency to suppose that we just need to keep chugging. More statistical processing please! Anything you like yet?
So I think that GIGO has the ADDITIONAL meaning that IF you put in garbage (random test signals) and get something GOOD out (not garbage), you ARE in serious trouble. (For example, the famous “hockey stick”.) Filters and models can make their own stuff.
Below are two studies I did recently, using only random test signals, the first relating to the trapezoidal smoothing (peripherally involved here), and the second in response to your “Parana River” posting back in January (spurious band-pass). I used only random data – Anything “seen” in my graphs is only apparent.
http://electronotes.netfirms.com/AN401.pdf
http://electronotes.netfirms.com/AN403.pdf
Possibly with regard to time-series processing: LESS is MORE?
Interesting article, and very instructive on how applying seemingly innocuous transformations to data can garble it. Though “factitious” would seem to mean the exact opposite of “fictitious”, which the context seems to demand.
Alexander K says:
May 17, 2014 at 9:32 pm
Thanks for that, Alexander. There are lots of things that I should write up for the journals, but I feel like I have to give myself an autolobotomy to write in the style that the journals seem to prefer.
Plus, the real joy for me is in doing the research and writing the computer programs and running them to see what the graphs look like. That’s the pleasure of the chase for me — the writing of the papers is just something I do to purge my mind of the results and to learn what I can from the comments about what I’ve done wrong and to make room for the next “oooh, shiny!” moment when I go haring off after something completely different.
Regards,
w.
Gone are the days when mathematics can validate an easy Newtonian clockwork. It’s kind of like mining gold or drilling oil. All the easy stuff is gone. Now you have to drill down a mile, turn a corner, drill another mile sideways and detonate some charges in your casing.
Ours is the quantum era where everything is weird and seemingly designed to deceive us. Our only cipher in the quantum arena is mathematics, but math has always been a Faustian bargain where certainty is bought at the price of initial conditions or parameters that may or may not be correct.
In physics these initial conditions and the following math has led us to string theory and parallel universes.
This is why I follow fish. They share with me an utter disbelief in parallel universes. Now the cycles fish follow and which we can measure with weigh master’s certificates may well be ephemeral. They may not have existed before the weigh masters and they may be overwritten in the future, but for right now they are as real as it gets.
I find that the longer the ‘cycle’, the less its plausibility. It’s already hard to imagine a mechanism that takes 22 years to come around, unless one looks at planetary influences. But 80 years or more? What would there be in the sun to reset every century?
Sunspot chatter. I love it.
After reading your article Willis, the only fault that seems to stand out is the lack of limits to your process.
The Gleissberg cycle, when applied to Schove, can vary anywhere from 72 to 83 years or 77.6
Schove also wrote, “The 78-year cycle [17] is clearly shown since 1610 by an alternation of periods of shortening (c. 1650-1700, c. 1725-65, c. 1890-1930) and lengthening cycles (c. 1700-25, c. 1765-1810, c. 1845-90).”
Other datasets have been used, like Elatina, that also show a strong correlation to the 78 year cycle.
Other cycles tend to pop out of these datasets, and it only makes me wonder if these cycles are but like gears in a clock.
Someone, someday will work it out.
Thanks for replying to another comment of mine by making it into a full post.
However,
I refer to table 2 here
http://virtualacademia.com/pdf/cli267_293.pdf
and you will note with me that with only sunspots as guideline you [really] have only touched the tip of the Gleiszberg.
I am almost sure that Gleiszberg also looked at the flooding of river and lakes, but I will have to do some investigating on that. In the meantime my best fit for the drop in the speed of maxima stands, which was not even mentioned in the table.
http://blogs.24.com/henryp/2012/10/02/best-sine-wave-fit-for-the-drop-in-global-maximum-temperatures/
omnologos says
http://wattsupwiththat.com/2014/05/17/the-tip-of-the-gleissberg/#comment-1639152
henry says
you have to step off from sunspots (which is a subjective measurement) and rather move to the strength of the polar fields of the sun
http://ice-period.com/wp-content/uploads/2013/03/sun2013.png
Note with me that if you draw a binomial best fit from the top (hyperbolic) and from the bottom (parabolic), it shows that the minimum strengths will be reached around 2016.
Note also the 2 past Hale cycles (44 years) for which we have reliable data.
Any one with a slight scientific brain can easily predict what the next 44 years of polar field strengths will look like….???
The interaction of the magnetospheres of Jupiter and Saturn with the solar field is much greater than we think.
The layer of metallic hydrogen within the fast-spinning Jupiter generates an enormous magnetic field around the planet. Its magnetic field is much stronger than Earth’s (which is generated by the turning of its iron core) or any of the other planets’. It also follows that Jupiter has the largest magnetosphere in the solar system, which is a tear-shaped bubble of charged particles constrained by Jupiter’s magnetic field. The variable solar wind flow interacts with the planet’s magnetosphere and as a result shapes it.
http://www3.imperial.ac.uk/spat/research/missions/space_missions/cassini/events/fly_by_schedule/jupiter_fly_by/jupiter_magnetosphere
Saturn is surrounded by a giant magnetic field, lined up with the rotation axis of the planet. This cannot be explained by current theories. Cassini may explain how the puzzling magnetic field of Saturn is generated.
http://www.esa.int/Our_Activities/Space_Science/Cassini-Huygens/Saturn_s_magnetosphere
Since they cyclically change position relative to the sun, their impact on the solar dynamo is cyclical.
Why should the measurement of a cycle of the sun be in years? One earth year is of no relevance to the sun. Even one Jupiter revolution of the sun is almost as irrelevant unless you also look at where the other planets are in relation to Jupiter and the sun.
Would Io, Europa and Ganymede have got into their synchronous orbits without tidal forces? Surely when trying to understand the solar cycles one has to look at the solar system as a whole.
There are some cycles in the climate that are indisputable: the 100,000-year ice ages followed by the relatively short interglacials, for example. And these have to do with the sun. That there may be shorter, smaller cycles is also possible. There have been warm and cold periods within our Holocene, for example.
Thought provoking as always Willis.
The trapezoidal filter is a common variant of the infamous running mean (RM). It is the same as an RM with the first and last points given 50% weighting. It’s very nearly as bad as an RM and has the same defect of introducing spurious frequencies that are not in the original data. Seventy years after Gleissberg, and with the availability of immense computer power on everyone’s lap, these are sadly just as popular.
“Look at what happens with the maximum envelope at 1860. In the original maximum data (light red), this is a low point, with peaks on either side … but after the filter is applied (dark red), it has magically turned into a high point. ”
Well, that’s the kind of thing that can happen with running means, but it’s not quite as bad as you make it sound. In this case there are three high data points and one less-high one. Even a decent low-pass filter would produce a similar result. Applying a convolution filter to irregular data is also questionable. It seems he did try to address this but did not explain it clearly enough for you to reproduce it. I doubt this has a major effect on the presence or not of the supposed 80-year period.
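The point about spurious frequencies can be made quantitative. The frequency response of the 1-2-2-2-1 kernel has negative side lobes, so a sinusoid with a period near three samples comes out of the filter with its sign flipped — exactly the mechanism that can turn a trough into a peak:

```python
import numpy as np

# Frequency response of the symmetric 1-2-2-2-1 kernel, evaluated as
# sum of w[k] * cos(omega * k) over taps k = -2..2.
w = np.array([1, 2, 2, 2, 1]) / 8.0

def response(period_samples):
    """Gain of the filter for a sinusoid of the given period (in samples)."""
    omega = 2 * np.pi / period_samples
    k = np.arange(-2, 3)  # kernel taps centred on zero
    return float(np.sum(w * np.cos(omega * k)))

print(round(response(3.0), 3))   # -> -0.125  (negative lobe: inverted output)
print(round(response(20.0), 3))  # long periods pass nearly unchanged (~0.93)
```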
However, Willis, you do have a technique that does not require regularly spaced data, so why not apply it to the min and max points in figure 3 if you want to cross-check his claim about that aspect of the data? You could also apply a better filter: a gaussian with sigma=1 (one circa ten year interval) for example.
I suspect you will find something similar.
Assuming that is the case, you could then ask why this period comes out from the peaks but not the full series as you mainly did here.
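For what it’s worth, a Gaussian smoother that weights by the actual observation times, rather than by index, handles irregular spacing directly. A minimal sketch, with sigma in the units of t (e.g. years):

```python
import numpy as np

def gaussian_smooth_irregular(t, y, sigma):
    """Gaussian-weighted smoothing using the actual observation TIMES,
    so irregular spacing is handled correctly (unlike an index-based
    convolution filter such as 1-2-2-2-1)."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    out = np.empty_like(y)
    for i, ti in enumerate(t):
        w = np.exp(-0.5 * ((t - ti) / sigma) ** 2)  # weight by time distance
        out[i] = np.sum(w * y) / np.sum(w)
    return out

# Sanity check: with a huge sigma everything shrinks toward the overall mean.
t = np.array([0.0, 9.0, 20.0, 33.0])  # irregularly spaced, like cycle maxima
sm = gaussian_smooth_irregular(t, [1.0, 3.0, 5.0, 7.0], 1e6)
print(sm)
```

With sigma set to one typical cycle interval (circa ten years), this would be one reasonable way to cross-check the envelope analysis on the irregular maxima and minima.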