Riding a Pseudocycle

Guest Post by Willis Eschenbach

Loehle and Scafetta recently posted a piece on decomposing the HadCRUT3 temperature record into a couple of component cycles plus a trend. I disagreed with their analysis on a variety of grounds. In the process, I was reminded of work I had done a few years ago using what is called “Periodicity Analysis” (PDF).

A couple of centuries ago, a gentleman named Fourier showed that any signal could be uniquely decomposed into a number of sine waves with different periods. Fourier analysis has been a mainstay analytical tool since that time. It allows us to detect any underlying regular sinusoidal cycles in a chaotic signal.

Figure 1. Joseph Fourier, looking like the world’s happiest mathematician
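
To give a feel for what Fourier analysis does, here is a small R sketch. It is purely illustrative (made-up data, not part of the analysis below): it builds a noisy sine wave with a 24-year period and picks that period out of the amplitude spectrum.

set.seed(1)
yrs = 158  # same length as the annual HadCRUT3 record
x = sin(2*pi*(1:yrs)/24) + rnorm(yrs, sd=0.3)  # 24-year sine wave plus noise
amp = Mod(fft(x))[2:(yrs/2)]  # amplitude spectrum, dropping the mean term
per = yrs/(1:(yrs/2 - 1))  # period in years corresponding to each Fourier frequency
plot(per, amp, type="h", xlab="Period (yrs)", ylab="Amplitude", xlim=c(0, 80))
# the tallest spike sits near a period of 24 years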

While Fourier analysis is very useful, it has a few shortcomings. First, it can only extract sinusoidal signals. Second, although it has good resolution at short timescales, it has poor resolution at longer timescales. For many kinds of cyclical analysis, I prefer periodicity analysis.

So how does periodicity analysis work? The citation above gives a very technical description of the process, and it’s where I learned how to do periodicity analysis. Let me attempt to give a simpler description, although I recommend the citation for mathematicians.

Periodicity analysis breaks down a signal into cycles, but not sinusoidal cycles. It does so by directly averaging the data itself, so that it shows the actual cycles rather than theoretical cycles.

For example, suppose that we want to find the actual cycle of length two in a given dataset. We can do it by numbering the data points in order, and then dividing them into odd- and even-numbered data points. If we average all of the odd data points, and separately average all of the even data points, we get the average cycle of length two in the data. Here is what we get when we apply that procedure to the HadCRUT3 dataset:

Figure 2. Periodicity in the HadCRUT3 global surface temperature dataset, with a cycle length of 2. The cycle has been extended to be as long as the original dataset.

As you might imagine for a cycle of length 2, it is a simple zigzag. The amplitude is quite small, only plus/minus a hundredth of a degree. So we can conclude that there is only a tiny cycle of length two in the HadCRUT3.
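
For those following along in R, here is a minimal sketch of that odd/even averaging. It assumes the detrended annual HadCRUT3 anomalies are already loaded in a vector I'll call hadcrut (an illustrative name; the actual code I used is in the Appendix).

idx = 1:length(hadcrut)
oddavg = mean(hadcrut[idx %% 2 == 1])  # average of the odd-numbered points
evenavg = mean(hadcrut[idx %% 2 == 0])  # average of the even-numbered points
cycle2 = rep(c(oddavg, evenavg), length.out=length(hadcrut))  # repeat to the full length of the data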

Next, here is the same analysis, but with a cycle length of four. To do the analysis, we number the dataset in order with a cycle of four, i.e. “1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 3, 4 …”

Then we average all the “ones” together, and all of the twos and the threes and the fours. When we plot these out, we see the following pattern:

Figure 3. Periodicity in the HadCRUT3 global surface temperature dataset, with a cycle length of 4. The cycle has been extended to be as long as the original dataset.
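
The same trick works for any cycle length. In sketch form it is essentially what the meanbyrow helper in the Appendix does:

cyclemean = function(x, n) {  # average cycle of length n, repeated to the full length of x
  cyclenum = ((0:(length(x)-1)) %% n) + 1  # label the points 1, 2, ..., n, 1, 2, ...
  rep(tapply(x, cyclenum, mean), length.out=length(x))  # average within each label, repeat to full length
}
# e.g. cyclemean(hadcrut, 4) gives the kind of length-4 cycle shown in Figure 3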

As I mentioned above, we are not reducing the dataset to sinusoidal (sine wave shaped) cycles. Instead, we are determining the actual cycles in the dataset. This becomes more evident when we look at, say, the twenty-year cycle:

Figure 4. Periodicity in the HadCRUT3 dataset, with a cycle length of 20. The cycle has been extended to be as long as the original dataset.

Note that the actual 20 year cycle is not sinusoidal. Instead, it rises quite sharply, and then decays slowly.

Now, as you can see from the three examples above, the amplitudes of the various length cycles are quite different. If we set the mean (average) of the original data to zero, we can measure the power in the underlying cyclical signals as the sum of the absolute values of the signal data. It is useful to compare this power value to the total power in the original signal. If we do this at all possible cycle lengths, we get a graph of the strength of each of the underlying cycles.
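
In sketch form, using the cyclemean function above (and assuming x has been detrended and set to zero mean), the power comparison is just:

powerfrac = function(x, n) sum(abs(cyclemean(x, n))) / sum(abs(x))  # fraction of total power in the length-n cycle
# scanning every length up to half the record gives the blue periodicity curves shown below:
# powers = sapply(2:(length(x) %/% 2), function(n) powerfrac(x, n))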

For example, suppose we are looking at a simple sine wave with a period of 24 years. Figure 5 shows the sine wave, along with periodicity analysis in blue showing the power in each of the various length cycles:

Figure 5. A sine wave, along with the periodicity analysis of all cycles up to half the length of the dataset.

Looking at Figure 5, we can see one clear difference between Fourier analysis and periodicity analysis — the periodicity analysis shows peaks at 24, 48, and 72 years, while a Fourier analysis of the same data would only show the 24-year cycle. Of course, the apparent 48- and 72-year peaks are merely a result of the 24-year cycle. Note also that the shortest length peak (24 years) is sharper than the longest length (72-year) peak. This is because there are fewer data points to measure and average when we are dealing with longer time spans, so the sharp peaks tend to broaden with increasing cycle length.

To move to a more interesting example relevant to the Loehle/Scafetta paper, consider the barycentric cycle of the sun. The sun revolves around the center of mass (the barycenter) of the solar system. As it does so, it speeds up and slows down because of the varying pull of the planets. What are the underlying cycles?

We can use periodicity analysis to find the cycles that have the most effect on the barycentric velocity. Figure 6 shows the process, step by step:

Figure 6. Periodicity analysis of the annual barycentric velocity data. 

The top row shows the barycentric data on the left, along with the amount of power in cycles of various lengths on the right in blue. The periodicity diagram at the top right shows that the overwhelming majority of the power in the barycentric data comes from a ~20-year cycle. It also demonstrates what we saw above: the spreading of the peaks of the signal at longer time periods because of the decreasing amount of data.

The second row left panel shows the signal that is left once we subtract out the 20-year cycle from the barycentric data. The periodicity diagram on the second row right shows that after we remove the 20-year cycle, the maximum amount of power is in the 83-year cycle. So as before, we remove that 83-year cycle.

Once that is done, the third row right panel shows that there is a clear 19-year cycle (visible as peaks at 19, 38, 57, and 76 years; this cycle may be a result of the fact that the “20-year cycle” is actually slightly less than 20 years). When that 19-year cycle is removed, there is a 13-year cycle visible at 13, 26, 39 years, etc. And once that 13-year cycle is removed … well, there’s not much left at all.

The bottom left panel shows the original barycentric data in black, and the reconstruction made by adding just these four cycles of different lengths is shown in blue. As you can see, these four cycles are sufficient to reconstruct the barycentric data quite closely. This shows that we’ve done a valid decomposition of the original data.
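
For the curious, here is a rough R sketch of that step-by-step peeling: find the cycle length with the most power, subtract its average cycle, and repeat on the residual. It uses the cyclemean and powerfrac sketches from earlier and assumes the detrended velocity series is in a vector I'll call baryvel; it mechanizes what I did by eye from the periodicity plots, so the lengths it picks may differ slightly from the ones above.

peelcycles = function(x, nsteps=4) {
  residual = x
  kept = list()
  for (i in 1:nsteps) {
    lens = 2:(length(x) %/% 2)
    pw = sapply(lens, function(n) powerfrac(residual, n))  # power in each candidate cycle length
    best = lens[which.max(pw)]  # strongest remaining cycle
    cyc = cyclemean(residual, best)
    kept[[i]] = list(len=best, cycle=cyc)
    residual = residual - cyc  # remove it and go again
  }
  recon = Reduce(`+`, lapply(kept, function(k) k$cycle))  # sum of the removed cycles
  list(components=kept, residual=residual, reconstruction=recon)
}
# e.g. fit = peelcycles(baryvel); plot(baryvel, type="l"); lines(fit$reconstruction, col="blue")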

Now, what does all of this have to do with the Loehle/Scafetta paper? Well, two things. First, in the discussion on that thread I said that I thought the 60-year cycle that Loehle/Scafetta said was in the barycentric data was very weak. As the analysis above shows, the barycentric data does not have any kind of strong 60-year underlying cycle. Loehle/Scafetta claimed that there were ~20-year and ~60-year cycles in both the solar barycentric data and the surface temperature data. I find no such 60-year cycle in the barycentric data.

However, that’s not what I set out to investigate. I started all of this because I thought that the analysis of random red-noise datasets might show spurious cycles. So I made up some random red-noise datasets the same length as the HadCRUT3 annual temperature record (158 years), and I checked to see if they contained what looked like cycles.

A “red-noise” dataset is one which is “auto-correlated”. In a temperature dataset, auto-correlated means that today’s temperature depends in part on yesterday’s temperature. One kind of red-noise data is created by what are called “ARMA” processes. “AR” stands for “auto-regressive”, and “MA” stands for “moving average”. This kind of random noise is very similar to observational datasets such as the HadCRUT3 dataset.
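
In sketch form, the recipe for this kind of red noise looks like the following (the hard-coded AR and MA values I actually used are in the Appendix; this just shows where such numbers come from, again assuming the detrended annual anomalies are in a vector called hadcrut):

fit = arima(hadcrut, order=c(1,0,1))  # fit an ARMA(1,1) to the detrended annual anomalies
arcoef = coef(fit)["ar1"]
macoef = coef(fit)["ma1"]
pseudo = arima.sim(list(ar=arcoef, ma=macoef), n=length(hadcrut))  # one random pseudo-temperature record
pseudo = (pseudo - mean(pseudo)) * sd(hadcrut) / sd(pseudo)  # rescale to the HadCRUT3 standard deviation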

So, I made up a couple dozen random ARMA “pseudo-temperature” datasets using the AR and MA values calculated from the HadCRUT3 dataset, and I ran a periodicity analysis on each of the pseudo-temperature datasets to see what kinds of cycles they contained. Figure 7 shows eight of the two dozen random pseudo-temperature datasets in black, along with the corresponding periodicity analysis of the power in various cycles in blue to the right of the graph of the dataset:

Figure 7. Pseudo-temperature datasets (black lines) and their associated periodicity (blue circles). All pseudo-temperature datasets have been detrended.

Note that all of these pseudo-temperature datasets have some kind of apparent underlying cycles, as shown by the peaks in the periodicity analyses in blue on the right. But because they are purely random data, these are only pseudo-cycles, not real underlying cycles. Despite being clearly visible in the data and in the periodicity analyses, the cycles are an artifact of the auto-correlation of the datasets.

So for example random set 1 shows a strong cycle of about 42 years. Random set 6 shows two strong cycles, of about 38 and 65 years. Random set 17 shows a strong ~ 45-year cycle, and a weaker cycle around 20 years or so. We see this same pattern in all eight of the pseudo-temperature datasets, with random set 20 having cycles at 22 and 44 years, and random set 21 having a 60-year cycle and weak smaller cycles.

That is the main problem with the Loehle/Scafetta paper. While they do in fact find cycles in the HadCRUT3 data, the cycles are neither stronger nor more apparent than the cycles in the random datasets above. In other words, there is no indication at all that the HadCRUT3 dataset has any kind of significant multi-decadal cycles.

How do I know that?

Well, one of the datasets shown in Figure 7 above is actually not a random dataset. It is the HadCRUT3 surface temperature dataset itself … and it is indistinguishable from the truly random datasets in terms of its underlying cycles. All of them have visible cycles, it’s true, in some cases strong cycles … but they don’t mean anything.

w.

APPENDIX:

I did the work in the R computer language. Here’s the code, giving the “periods” function which does the periodicity calculations. I’m not that fluent in R (it’s about the eighth computer language I’ve learned), so it might be kinda klutzy.

#FUNCTIONS

PI=4*atan(1) # value of pi

dsin=function(x) sin(PI*x/180) # sine function for degrees

regb =function(x) {lm(x~c(1:length(x)))[[1]][[1]]} #gives the intercept of the trend line

regm =function(x) {lm(x~c(1:length(x)))[[1]][[2]]} #gives the slope of the trend line

detrend = function(x){ #detrends a line

x-(regm(x)*c(1:length(x))+regb(x))

}

meanbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle means

rep(tapply(x,modline,mean),length.out=length(x))

}

countbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle number of datapoints N

rep(tapply(x,modline,length),length.out=length(x))

}

sdbyrow=function(modline,x){ #returns a full length repetition of the underlying cycle standard deviations

rep(tapply(x,modline,sd),length.out=length(x))

}

normmatrix=function(x) sum(abs(x)) #returns the norm of the dataset, which is proportional to the power in the signal

# Function “periods” (below) is the main function that calculates the percentage of power in each of the cycles. It takes as input the data being analyzed (inputx). It displays the strength of each cycle. It returns a list of the power of the cycles (vals), along with the means (means), number of datapoints N (count), and standard deviations (sds).

# There’s probably an easier way to do this, I’ve used a brute force method. It’s slow on big datasets

periods=function(inputx,detrendit=TRUE,doplot=TRUE,val_lim=1/2) {

x=inputx

if (detrendit==TRUE) x=detrend(as.vector(inputx))

xlen=length(x)

modmatrix=matrix(NA, xlen,xlen)

modmatrix=matrix((col(modmatrix)-1) %% row(modmatrix),xlen,xlen) # %% is R's modulo operator

countmatrix=aperm(apply(modmatrix,1,countbyrow,x))

meanmatrix=aperm(apply(modmatrix,1,meanbyrow,x))

sdmatrix=aperm(apply(modmatrix,1,sdbyrow,x))

xpower=normmatrix(x)

powerlist=apply(meanmatrix,1,normmatrix)/xpower

plotlist=powerlist[1:(length(powerlist)*val_lim)]

if (doplot) plot(plotlist,ylim=c(0,1),ylab="% of total power",xlab="Cycle Length (yrs)",col="blue")

invisible(list(vals=powerlist,means=meanmatrix,count=countmatrix,sds=sdmatrix))

}

# /////////////////////////// END OF FUNCTIONS

# TEST

# each row in the values returned represents a different period length.

myreturn=periods(c(1,2,1,4,1,2,1,8,1,2,2,4,1,2,1,8,6,5))

myreturn$vals

myreturn$means

myreturn$sds

myreturn$count

#ARIMA pseudotemps

# note that they are standardized to a mean of zero and a standard deviation of 0.2546, which is the standard deviation of the HadCRUT3 dataset.

# each row is a pseudotemperature record

instances=24 # number of records

instlength=158 # length of each record

rand1=matrix(arima.sim(list(order=c(1,0,1), ar=.9673,ma=-.4591),

n=instances*instlength),instlength,instances) #create pseudotemps

pseudotemps =(rand1-mean(rand1))*.2546/sd(rand1)

# Periodicity analysis of simple sine wave

par(mfrow=c(1,2),mai=c(.8,.8,.2,.2)*.8,mgp=c(2,1,0)) # split window

sintest=dsin((0:157)*15)# sine function

plotx=sintest

plot(detrend(plotx)~c(1850:2007),type="l",ylab="24 year sine wave",xlab="Year")

myperiod=periods(plotx)

416 Comments

RACookPE1978
Editor
July 30, 2011 10:24 am

Geoff Sharp says:
July 30, 2011 at 9:42 am

Here is the barycentre velocity graph from Willis with the quasi 60 year cycle annotated.
http://tinyurl.com/2dg9u22/images/willis.png

Thank you. But, you need to add the “minus” points as well: 3 of the 4 minimums between each red dot maximum are “lowest minimums” as well. Thus, you are seeing a (strong) 20 year cycle – that Leif found in his analysis of the solar-planet movements! – and a much weaker 60 year cycle – of which you plotted only the four “red dot” high points.
Now, go look at the sunspot (22 year) cycles of positive to positive (or negative to negative) sunspot count peaks. There also, the peaks tend to be grouped in sets of high, medium, and low counts.

Ninderthana
July 30, 2011 10:26 am

Willis Eschenbach,
You won’t see a 60 year signal because you are doing your periodic analysis on the Sun’s speed about the Barycentre. The periodicity does not exist in that data set.
You will see a 60 year periodicity if you do your periodic analysis of the Sun’s velocity about the Barycentre. Speed only deals with the magnitude of the rate of motion, while velocity deals with both the magnitude and direction of the rate of motion. The 60 year signal is buried in the direction part of the rate of motion of the Sun about the Barycentre.

July 30, 2011 10:30 am

Ninderthana says:
July 30, 2011 at 10:18 am
One circuit with respect to the stars means that whatever mechanism is involved, it is aligned/synchronized with the seasons here on the Earth.
The seasons are synchronized with the sun [the tropical year], not with the distant stars.

July 30, 2011 10:34 am

Geoff Sharp says:
July 30, 2011 at 9:42 am
Here is the barycentre velocity graph from Willis with the quasi 60 year cycle annotated.
http://tinyurl.com/2dg9u22/images/willis.png

Which clearly shows how insignificant the 60-yr modulation is. Thanks for pointing that out so succinctly.

Wayne
July 30, 2011 10:35 am

Here’s what I did to get a nice wavelet display in R:
library (dplR)
str (hadcrut3)
# -> Time-Series [1:1944] from 1850 to 2012: -1.757 -0.267 -0.409 -0.779 -0.552 …
hadcrut3.s <- smooth.spline (hadcrut3, spar=0.9)
hadcrut3.r <- hadcrut3 - hadcrut3.s$y # Detrending, may or may not pass muster
hadcrut3.wave <- morlet (y1=hadcrut3.r, x1=seq (from=1850, by=1/12, length.out=1944), p2=10, dj=0.1)
wavelet.plot (hadcrut3.wave)
The resulting graph is very pretty and shows a roughly 20-year frequency across the timeframe. It's statistically significant, though it may simply be an artifact of red noise, as you note. The dplR package makes it quite easy to make this nice graph, and interestingly it is a package meant for working with tree rings.

July 30, 2011 10:36 am

RACookPE1978 says:
July 30, 2011 at 10:24 am
If you look at the original Scafetta graph you will see the low points are taken into consideration. He uses spectral analysis to create his quasi 60 year trend.

dp
July 30, 2011 10:37 am

Leif beat me to the send button with

If you assume that the cause is astrological [‘distant stars’ – e.g. whether the Sun is in Leo or some other sign] you may have a point, …

Unless someone has shown a short term (60 year) galactic influence on the local climate this is an insignificant factoid. I think that changes if we consider what happens when the local system ascends out of the galactic plane and becomes exposed to the full brunt of the disk of the Milky Way. Our current orientation in our galaxy is such that we are shielded by dust from the vast majority of our neighboring stars. What might the night sky temperature be if we were not in the shadow of the Milky Way?

July 30, 2011 10:40 am

The Monster says:
July 30, 2011 at 10:10 am
Unfortunately, with 150 years of data, if we split it into three parts, we wouldn’t even have a full 60-year cycle to look at.
Since L&S [or at least S] claims that the cause of the temperature variations is solar and related to its barycentric motions [or tides] we can investigate the source. Here is the power spectrum of 6000 years of barycentric distance [~22,000 data points]: http://www.leif.org/research/FFT-Barycenter-Distance.png There is no prominent 60-year cycle.

Paul Vaughan
July 30, 2011 10:40 am

Steve McIntyre (July 30, 2011 at 5:28 am) wrote:
“Since Sethares and Staley refer to wavelet transforms, I presume that (contra Nick Stokes) their methodology has considered wavelet methods, though I am not in a position right now to comment on whether Sethares and Staley’s method is a useful improvement on wavelets or not.”
Wavelet methods were in relative infancy in 1999. Their application has risen exponentially and with this rise has come adaptive radiation (in the sense used in evolutionary biology). I would caution against cookbook implementation of the more dull conceptions of “what wavelet methods do”. Wavelet methods are extraordinarily flexible and limited only by shortcomings of practitioners’ imaginations. Many theoreticians are HOPELESSLY blinded in practice by algebraic abstractions underpinned by untenable assumptions. And many practitioners just parrot the widely-available NARROW-SCOPE wavelet algorithms of others before hastily drawing premature (& inaccurate) conclusions about utility, rather than operating intuitively & adeptly from a base in deep conceptual understanding.
Regards.

July 30, 2011 10:46 am

Willis Eschenbach says:
July 30, 2011 at 10:36 am
Geoff Sharp says:
July 30, 2011 at 9:42 am
Here is the barycentre velocity graph from Willis with the quasi 60 year cycle annotated.
http://tinyurl.com/2dg9u22/images/willis.png
Not sure what your point is here, Geoff. Both periodicity and Fourier analysis show that the ~60-year cycle is very tiny, an order of magnitude or more smaller than the 20-year cycle. Yes, it exists, but it’s hardly significant.

Willis I am afraid you keep missing the point. Think of the 20 year cycle as a background engine, the power of that engine is controlled by another force or modulator. The engine is slowing for 30 years then speeding up for 30 years….this is the same as the PDO cycle. To ignore it would be foolish.

July 30, 2011 11:06 am

To Willis Eschenbach,
I am sorry that I need to contradict Willis, his analysis is very poor.
Our analysis is based on the correct techniques, that is, “multiple” power spectrum analysis against a red noise background. I would like to insist on the word “multiple” because I used three alternative methods. The quasi 20 and 60 year cycles are quite evident in the data. These tests are done in Scafetta 2010. In L&S 2011 we simply reference those results.
Moreover, similar cycles have been found by numerous other people in numerous climatic data and published in numerous papers. So, there is very little to question.
Moreover, the curves shown in figures 1, 2, 3, 4 show equal cycles which are not sinusoidal, but are clearly equal.
See for example the above 20-year modulation shown in figure 4. The cycles are not sinusoidal, but they are still perfectly “equal”.
It is very unlikely that the temperature presents such a perfect repetition of cycles; that would be possible only if the temperature were made of cycles with perfect periods of 20, 10, 5, 4, 2.
What Willis did is simply to calculate a single average cycle and then plot this same cycle many times in a consecutive way.
Try to use a sequence made of two cycles with periods 20 and 15, then use your 20 period and you will see that your technique fails to properly reproduce the modulation of the curve.

July 30, 2011 11:28 am

Geoff Sharp says:
July 30, 2011 at 9:21 am
Also, the 172 year cycle in the temperature or solar proxy record is not supremely evident because the cycle has multiple prongs.[…]
If we only relied on Fourier analysis the world would be a poorer place.

Sometimes just looking at the data works too [although Fourier analysis would also pick up any cycles, even if the period is not strictly constant]. There is no correlation between your U/N 172 stuff and solar activity. Here is a direct comparison for the past 6000 years. The solar activity is the ‘latest and greatest’ from Steinhilber et al., combining both 10Be and 14C. The activity data are 25-year means so won’t show the solar cycle:
http://www.leif.org/research/Solar-Activity-vs.Barycenter-Distance-BC.png
http://www.leif.org/research/Solar-Activity-vs.Barycenter-Distance-AD.png
As you and everybody can clearly see there is no consistent correlation between your U/N influences [denoted by circles] and Grand Minima, or anything else for that matter.

commieBob
July 30, 2011 11:46 am

nicola scafetta says:
My assumption is that you are working with a data set consisting of average annual temperatures. You have fewer than 200 data points. Although I have zero experience with geophysical data my experience with electronic signals tells me that you should not defend your results too tenaciously. 😉

Steve from Rockwood
July 30, 2011 11:46 am

Joseph Fourier looks a little bit like my girlfriend’s hairdresser, who also by coincidence is a Ph.D. mathematician who never made it to the AGW trough. /s
But I always thought Fourier transforms didn’t like non-periodic trends in the data due to the assumption of a periodic function. We always removed the trend or applied a tapering filter at either end to force periodicity.
Willis, all your graphs seem to represent data sets that more or less start and end at the same value (of the y-axis). In the example of temperature curves, today’s temperature “background” is a positive value higher than the trough-to-peak values of the so-called periodic trends that return to the baseline. I think Fourier Analysis wouldn’t like such a data set.
Periodicity analysis would be even worse for long term trends (features of the time series that have a period much longer than the sampling period). In your Figure 6 none of the data sets shows a long term positive trend so I’m not sure you’ve proven anything. You seem to have used periodic data as input and proved periodicity analysis works.
Plus I thought Loehle and Scafetta produced a very good piece. Their numbers make sense when I eyeball the measured temperature graphs.
But I always enjoy your articles Willis – I haven’t frequency filtered anything in years.

Tad
July 30, 2011 11:56 am

I think the randomization test you’re showing is great if Loehle and Scafetta were just looking to see if any sort of periodicities existed, but they weren’t – they were fitting models on the 20- and 60-year cycles that were already known to exist. So that means that it isn’t really a multiple comparison problem, which is what your randomization approach seems to be testing (if I’m understanding the situation – a dubious assumption at best).

BarryW
July 30, 2011 12:38 pm

Ah Willis, I don’t think your red noise test really shows that their fourier analysis fails on red noise. It just shows that your periodicity test fails on it by finding a pseudo-cycle. You would have to show that a fourier analysis also produced the same results on the red noise data sets.

NikFromNYC
July 30, 2011 12:39 pm

Adding a tear to the debased nun rounds out my take on this issue by better expressing the sadness I attach to over-specialized and myopic analysis paralysis and its corrupting influence upon contemporary affairs, as does the addition of a photo of my green bile drooling dead cat Freddy in ’05, back when I started wasting my life on weather worry, long after I had exited science due to the victory of hype, political correctness and corporatism over substance, bravado and curiosity within academia. The time I wasted online that afternoon meant Fred died alone in the other room, for what I thought was a half hour online had turned to six so he was now stiff as a board. A section from S. Dali’s last painting covers both the last and next century to mark the end of a muddled era as illuminated by the glistening and terrible Beauty of pure mathematics.
http://i.minus.com/ijkhds.jpg
“The individual sciences of our epoch have become specialized in these three eternal vital constants the sexual instinct, the sense of death, and the space-time anguish. After their analysis, after the experimental speculation, it again becomes necessary to sublimate them. The sexual instinct must be sublimated in esthetics; the sense of death in love; and the space-time anguish in metaphysics and religion. Enough of denying; one must affirm. Enough of trying to cure; one must sublimate! Enough of disintegration; one must integrate, integrate, integrate. Instead of automatism, style; instead of nihilism, technique; instead of skepticism, faith; instead of promiscuity, rigor; instead of collectivism and uniformization individualism, differentiation, and hierarchization; instead of experimentation, tradition. Instead of Reaction or Revolution, RENAISSANCE!” – Salvador Dali (The Secret Life of Salvador Dali 1942)

William
July 30, 2011 12:52 pm

It seems the orbital position of the planets is affecting the sun; however, the analysis of the problem/observations will be difficult if there are multiple changes occurring which interact with each other. It seems, based on the reasons and observations noted below, the mechanism may not necessarily be just gravitational effects of one body on another.
There is unequivocal paleoclimatic evidence that the earth’s climate changes on a pseudo cycle on a centennial and millennial basis (Medieval warm period, Little Ice Age, and so on), with a strong change with a period of 1470 years (plus or minus a beat frequency), and with very, very strong changes (abrupt climate events such as the Younger Dryas event or the 8200 BP abrupt cooling event, or the termination event of the last 22 interglacial periods) with a period of roughly 8000 to 10,000 years.
Paleoclimatic researchers have known for years that there is concurrent with these pseudo cycles of warming and cooling and abrupt climate changes events, cosmogenic isotope changes. What was not known is what is causing the cosmogenic isotope changes and how what was causing the cosmogenic isotope changes, could cause the planet to cool or warm. It is now apparent the cooling or warming is caused by changes that affect the amount of planetary cloud cover, the albedo of planetary clouds, and regionally the amount and albedo of the clouds that form. (The mechanism can cause some regions to have less clouds and warm and other regions to have more clouds and cool which complicates the paleoclimatic analysis.)
As I have noted, geomagnetic field specialists in the last 10 years have found that the tilt of the geomagnetic field is abruptly changing with a pseudo cycle and, related to the geomagnetic field axis change, that there are pseudo cyclic intensity changes to the geomagnetic field (sometimes the event reinforces the field and other times it opposes the field). The axis orientation change of the geomagnetic field changes the pole’s location relative to the earth’s rotational axis, which changes the relative distance from the geomagnetic pole to different locations on the continents. The geomagnetic field change, in turn, changes GCR intensity and magnitude at lower latitudes and at higher latitudes (Svensmark’s book explains the mechanism and how it is affected by distance from the geomagnetic pole.). i.e. The geomagnetic pole no longer aligns with the rotational axis of the planet. After the event the geomagnetic field integrates the change causing the geomagnetic field intensity to increase or decrease. There are unexplained cycles of geomagnetic field intensity.
The geomagnetic field change has a long term effect on the planet’s climate. That explains how a short term solar event can have a long term effect on the planet’s climate. (i.e. 70% of the Younger Dryas cooling occurred in roughly 10 years, with almost 100% of the cooling in 100 years, and the cooling lasted for around 1200 years. The solar magnetic field cycle or TSI does not reduce for 1400 years.)
Further complicating the after the fact analysis of the cycles and abrupt changes – based on an assumed mechanism where the sun is the cause of the observed change – how much the solar event changes the geomagnetic field depends on the eccentricity of the earth’s orbit, the tilt of the earth at the time of the event, the timing of perihelion at the time of event, and whether there are insulating ice sheets on the planet. (Think of a large solar event -there are also smaller more frequent solar events – that occurs roughly at a frequency of 8000 to 10,000 years now see how the tilt of the planet and timing of perihelion has changed between events to change the effect of the event on the geomagnetic field.) There are also smaller geomagnetic field tilt changes with a periodicity of roughly 400 years.
There must be a physical reason, a cause as to what is changing the geomagnetic field. The fact that there are roughly a hundred papers noting cosmogenic isotopes changes correlate with climate changes is smoking gun evidence that the sun is the serial climate changer. There appears to be no earth mechanism that can change the geomagnetic field as rapidly and with observed cyclic timing (there are physical limits as to how fast a core based change can affect the total geomagnetic field due to counter acting EMF fields that are generated in the liquid core and there is no physical event that abruptly cause core base changes which in turn could cause the geomagnetic field to change, core based changes are orders of magnitude slower.) If the assertion that a core based change is physically capable of causing the geomagnetic field observations then it seems there must be some pseudo cyclical solar event that is causing the geomagnetic field to change. There appears to be no other logical possible cause. There must be a physical cause to what is observed.
There are a whole suite of astrophysical anomalies that could possibly be explained by the fundamental reason why the sun is changing and how it could affect the geomagnetic field and the magnetic field of the other planets (For example a cyclic abrupt solar event could possibly explain the anomalous orientation of the Uranus and Neptune magnetic field where the field does not align with the planet’s rotational axis and is further more off set from the center of the planet’s core.).
A possible methodology to develop the mechanism is to start with a strawman of the fundamental mechanism then to look for anomalies to outline and define the fundamental mechanism. For example, I found an interesting series of papers written to explain very strong magnetic fields associated with quasars and cyclic monotonically increasing long scale changes of quasar spectrum, and so forth, which indicate the collapse of a large object does not form a black hole. The object formed is not stable and gradually breaks up with an electromagnetic mechanism which explains the very strong magnetic field, jets, and ejected material. Perhaps the object formed from the collapse of large objects could be similar for the collapse of a supernova.
There are at least a dozen different groups/individuals from four or five different specialties that have published papers concerning observations and anomalies that appear to be related to this mechanism. Observational data concerning the astrophysical anomalies is improving. Perhaps an answer will come out of that work.
http://www.sciencedirect.com/science/article/pii/S1364682610004074
Sun–earth relationship inferred by tree growth rings in conifers from Severiano De Almeida, Southern Brazil
This study of Sun–Earth relationships is based on tree growth rings analysis of araucarias (Araucaria angustifolia) collected at Severiano de Almeida (RS) Brazil. A chronology of 359 years was obtained, … periods of solar activity of 11 (Schwabe cycle), 22 (Hale cycle), and 80 (Gleissberg cycle) years. The result shows the possible influence of the solar activity on tree growth in the last 350 years. Periods of 2–7 years were also found and could represent a response of the trees to local climatic conditions. Good agreement between the time series of tree growth rings and the 11 year solar cycle was found during the maximum solar activity periods.
http://ruby.fgcu.edu/courses/twimberley/EnviroPhilo/LongPeriod.pdf
LONG-PERIOD CYCLES OF THE SUN’S ACTIVITY RECORDED IN DIRECT SOLAR DATA AND PROXIES
Abstract. Different records of solar activity (Wolf and group sunspot number, data on cosmogenic isotopes, historic data) were analyzed by …. It was confirmed that two long-term variations in solar activity …. of 50–80 years and 90–140 year periodicities. The structure of the Suess cycle is less complex showing a variation with a period of 170–260 years. Strong variability in Gleissberg and Suess frequency bands was found in northern hemisphere temperature multiproxy that confirms the existence of a long-term relationship between solar activity and terrestial climate.
Stuiver and Braziunas (1993) analyzed the long decadal 14C series and found significant 89 and 148 year periodicities for 6000–2000 B.C. and a 126-year variation for 2000 B.C.–1840 A.D. Existence of two kinds of century-long solar variability – 115 year and 95 year cycles – was claimed by Chistyakov (1986). …variations in the Gleissberg and Suess frequency range, using all the complexity of direct and indirect solar data and applying modern statistical methods. The link between solar activity and terrestrial climate is also considered.
The following is a link to Bond’s paper “Persistent Solar influence on the North Atlantic Climate during the Holocene”
http://www.essc.psu.edu/essc_web/seminars/spring2006/Mar1/Bond%20et%20al%202001.pdf

July 30, 2011 1:04 pm

Steve from Rockwood says:
July 30, 2011 at 11:46 am
Willis, all your graphs seem to represent data sets that more or less start and end at the same value (of the y-axis). In the example of temperature curves, today’s temperature “background” is a positive value higher than the trough-to-peak values of the so-called periodic trends that return to the baseline. I think Fourier Analysis wouldn’t like such a data set.
It actually works quite well. Here are four cases of the function a*t+sin(t), where the sine curve will have a period of 2pi=6.3: http://www.leif.org/research/FFT-Periods-with-Trends.png where a varies from 0 [no trend] to a rather extreme a = 0.1 where the trend over the ~40 cycles is about 12 times larger than the amplitude of the sine-curve [peak to valley]. In all cases the FFT picks up a clear peak at 6.3.