It has been a while since I’ve looked at the Ap Index. The last time was April of 2009.
From the data provided by NOAA’s Space Weather Prediction Center (SWPC), you can see just how little geomagnetic (Ap) activity there has been since then. Here’s my graph of the September 2009 SWPC Ap data:

For a longer perspective, David Archibald has a graph of the Ap Index back to 1932. In December 2008 the average geomagnetic planetary index, Ap, was at its lowest level in 75 years:
Click for a larger image – I’ve added some annotation to the graph provided by Archibald to point out areas of interest and to clarify some aspects of it for the novice reader.
The last time the Ap index was anywhere near this low was 1933; it has never been as low as the December 2008 value of 2. (Note: Leif Svalgaard contends this value is erroneous, and that 4.2 is the correct value – either way, it is still lower than the 1933 figure.) Further, the trend from October 2005 continues to remain low, though some signs of a slight rebound are showing.
The Ap index is a proxy that tells us the sun is now quite inactive, and other indices, such as the sunspot number and the 10.7 cm radio flux, confirm this. The sun is in a full-blown funk, and your guess is as good as mine as to when it might pull out of it. So far, predictions by NOAA’s SWPC and NASA’s Hathaway have not come close to the reality being measured.

As Leif Svalgaard points out, Ap is just one of several indices that describe geomagnetic activity. Others [aa, am, IHV, …] go much further back in time [to the 1840s]. You can get more info from:
http://www.leif.org/research/IAGA2008LS.pdf and
http://www.leif.org/research/Seminar-UCLA-ESS288.pdf
For those who follow the sunspot number (SSN), I’ve graphed Ap and SSN together. As you can see, we’ve been in a reduced state of solar activity for quite some time now. It has been almost four years since the prominent drop in Ap in October 2005, and SSN mirrors the decline of the Ap index over that period.
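If you want to reproduce a combined plot like this, here is a minimal Matlab sketch. It is only a sketch: it assumes you have already assembled the monthly values into vectors, and the variable names (yr, Ap, SSN) are mine, not official SWPC column names.
% yr is the decimal year; Ap and SSN are monthly means aligned with yr
[ax, hAp, hSSN] = plotyy(yr, Ap, yr, SSN);   % Ap on the left axis, SSN on the right
ylabel(ax(1), 'Ap index');
ylabel(ax(2), 'Sunspot number (SSN)');
xlabel(ax(1), 'Year');
title(ax(1), 'Monthly Ap index and sunspot number');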

As many regular readers know, I’ve pointed out several times the abrupt and sustained lowering of the Ap Index that occurred in October 2005. That step change seemed (to me) out of place with the rest of the data, and since then the record has seemed less “active”, with reduced amplitudes. The sun appears to have settled onto a lower Ap plateau after that step change and has not recovered in almost four years. That seems to me a noteworthy event.
UPDATE: Thanks to Leif Svalgaard, we have a more extensive and “official” Ap dataset (NOAA’s SWPC data has issues, see comments) that I’ve plotted below. The step change in October 2005 is still visible, and the value of 3.9 recorded in April of this year is the lowest in the entire dataset.

And I’ve also plotted the 1991 to present data from BGS/Svalgaard to compare against the NOAA SWPC data:


Leif Svalgaard (06:11:08) :
Once again it is your data. I see that others disagree. When you make statements that the HMF is the same during SC23 & SC13 it would be good if you stated that was your assessment and other reputable scientists might have different views.
btw.. My last response re Usoskin was lost in the moderator queue…you might have to read back.
Geoff Sharp (07:40:19) :
Once again it is your data. I see that others disagree. When you make statements that the HMF is the same during SC23 & SC13 it would be good if you stated that was your assessment and other reputable scientists might have different views.
On the graphs you can also see the data by Rouillard et al 2007 [our main competitors in this game]. It is clear that they agree with us about the magnitude of the HMF, so what other reputable scientists are you talking about?
IMHO, Livingston & Penn’s spots ‘will become invisible’ is equivalent to ‘are disappearing’.
Leif 6:02:31
Thank you. Elegantly spake.
==================
Geoff Sharp (01:24:03) :
You must think we are all stupid
Not all, just you know who.
Conservatively, of over 54 predictions there would be at least 35 that follow the Babcock-Leighton model.
And which would they be? I have studied them all carefully.
Hathaway is a firm Babcock believer; the summary of his and Wilson’s prediction reads “Fast meridional circulation speed during cycle 22 leads to a strong solar cycle 24.” Their prediction is listed under the “physics” heading.
I’m sorry, I missed that one [makes it five then]. The prediction Hathaway is most known for [using aa or IHV] does not rely on the B-L model.
The great majority of the predictions are from those in the Babcock camp.
Almost every solar physicist is in the Babcock camp. The issue was whether they used B-L for their predictions and most did not.
Usoskin’s work has been useful, but his summary is blindly ridiculous.
I don’t think so, and apparently neither does he or his co-workers, e.g. Solanki.
Fluff and twaddle, the point of the exercise is SUNSPOTS, there is nothing new going on here. The other information is interesting but only a distraction.
As I said in an upthread comment:
“Lots of people will make a lot of nonsense noise about this, along the lines of “but you said the SUNSPOT NUMBER would be 72” and not [some willfully] appreciate that the predictions are about the magnetic field in the active regions and not about the visibility of spots.”
Here we see the first of those.
Leif Svalgaard (08:09:57) :
so what other reputable scientists are you talking about?
McCracken plus Lockwood, and who knows how many others that you have not referenced. This is a recurring problem, as you seem to be the only scientist active in this forum.
Your comments on my Usoskin reply?
Leif Svalgaard (08:09:57) :
Geoff Sharp (08:32:06) :
Your comments on my Usoskin reply?
Just read your reply..not at all convincing
Geoff Sharp (01:24:03) :
The great majority of the predictions are from those in the Babcock camp.
“Almost every solar physicist is in the Babcock camp. The issue was whether they used B-L for their predictions and most did not.”
The B-L model can accommodate a wide variety of outcomes, because much depends on the boundary conditions assumed. E.g. Memory time, circulation speed, diffusion coefficient, location of dynamo, etc. Things that are poorly known and therefore must be assumed or guessed. If one guesses wrong, the prediction comes out wrong. As simple as that. Not the fault of the model, but of the input data.
Leif Svalgaard (08:27:05) :
I’m sorry, I missed that one [makes it five then]. The prediction Hathaway is most known for [using aa or IHV] does not rely on the B-L model.
The great majority of the predictions are from those in the Babcock camp.
Almost every solar physicist is in the Babcock camp. The issue was whether they used B-L for their predictions and most did not.
If that were true it doesn’t say much for the B-L theory… you can’t have your cake and eat it too. They follow the B-L theory but choose to use a different method to predict SSN… something amiss here?
gary gulrud (08:17:10) :
IMHO, Livingston & Penn’s spots ‘will become invisible’ is equivalent to ‘are disappearing’.
Which is irrelevant, because they will still be clearly visible in the UV [e.g. Ca II K-line] or in magnetograms, so are still there.
Geoff Sharp (08:45:33) :
They follow the B-L theory but choose to use a different method to predict SSN….something amiss here?
What is amiss is your understanding of the science. Since the boundary conditions in the model are poorly known, people look for other ways of predicting the cycle [i.e. not using the model with yet another wild guess at what the input should be], e.g. looking at other indicators of solar activity, like coronal brightness, power spectral analysis, etc. The wide spread shows that these other indicators ain’t no good. Better to stick with the B-L model as a few did, and to delineate clearly what the assumptions were, so that, if the prediction turns out wrong, we can cross that set of assumptions off the list of good ones, and learn something.
Leif Svalgaard (08:43:57) :
The B-L model can accommodate a wide variety of outcomes, because much depends on the boundary conditions assumed. E.g. Memory time, circulation speed, diffusion coefficient, location of dynamo, etc. Things that are poorly known and therefore must be assumed or guessed. If one guesses wrong, the prediction comes out wrong. As simple as that. Not the fault of the model, but of the input data.
And that is exactly the problem with the model. No matter what happens, the model can explain it, although when questioned it fails miserably. Pseudo-science in my view.
You have great observation skills… Rise above it and search for a meaningful answer… it’s not too late 🙂
Geoff Sharp (08:32:06) :
McCracken plus Lockwood, and who’s how many others that you have not referenced. This is a recurring problem as you seem to be the only scientist active in this forum.
Lockwood is a coauthor of Rouillard et al (2007) and so agrees with us. His old 1999 paper is no longer valid. McCracken was relying on the obsolete 1999 Lockwood paper. There are no others.
Your comments on my Usoskin reply?…
Just read your reply..not at all convincing
I’m not fishing for your acceptance or trying to convince you, just stating the facts.
Geoff Sharp (09:15:20) :
And that is exactly the problem with the model. No matter what happens, the model can explain it, although when questioned it fails miserably. Pseudo-science in my view.
That is because you do not know what science is, but obviously have an intimate interest in pseudo-science. When people suspected that there might be a planet outside of Neptune, they assumed that its distance would be about twice that of Neptune, because the other planets roughly followed that ‘rule’. There was no Pluto at the computed position [and Pluto was discovered by an exhaustive search instead], partly because the assumption was wrong. That does not mean that Newton’s laws failed. Just that the input was wrong. Same thing with B-L.
Leif Svalgaard (09:26:36) :
So when questioned you can only resort to ridicule and ad hominem. Pitiful.
Geoff Sharp (09:36:15) :
So when questioned you can only resort to ridicule and ad hominem. Pitiful.
“Usoskin’s work has been useful, but his summary is blindly ridiculous.”
I think I have given detailed explanations for everything; is there anything specific you need to have a more detailed explanation about? Or do you agree that my explanations were sufficient to alleviate your concerns? If not, what specifically is still outstanding?
Leif Svalgaard (02:19:57) :
Having never before been observed, there is no speculation, only lines of thought. If we start tip-toeing around the tulips for fear of being speculative or upsetting the apple cart, then we shall surely bury any science beyond, and put it out of reach of discovery.
rbateman (10:30:02) :
Having never before been observed
Perhaps during the Maunder Minimum?
there is no speculation, only lines of thought. If we start tip-toeing around the tulips for fear of being speculative or upsetting the apple cart, then we shall surely bury any science beyond, and put it out of reach of discovery.
Speculation is fine, as long as one knows one is speculating. And sometimes speculation is necessary to make progress. So we speculate.
Leif Svalgaard (09:18:52) :
“Lockwood is a coauthor of Rouillard et al (2007) and so agrees with us. His old 1999 paper is no longer valid. McCracken was relying on the obsolete 1999 Lockwood paper. There are no others.”
THE RISE AND FALL OF OPEN SOLAR FLUX DURING THE CURRENT GRAND SOLAR MAXIMUM
M. Lockwood et al 2009
ABSTRACT. We use geomagnetic activity data to study the rise and fall over the past century of the solar wind flow speed V_SW, the interplanetary magnetic field strength B, and the open solar flux F_S. Our estimates include allowance for the kinematic effect of longitudinal structure in the solar wind flow speed. As well as solar cycle variations, all three parameters show a long-term rise during the first half of the 20th century followed by peaks around 1955 and 1986 and then a recent decline. Cosmogenic isotope data reveal that this constitutes a grand maximum of solar activity which began in 1920, using the definition that such grand maxima are when 25-year averages of the heliospheric modulation potential exceed 600 MV. Extrapolating the linear declines seen in all three parameters since 1985 yields predictions that the grand maximum will end in the years 2013, 2014, or 2027 using V_SW, F_S, or B, respectively. These estimates are consistent with predictions based on the probability distribution of the durations of past grand solar maxima seen in cosmogenic isotope data. The data contradict any suggestions of a floor to the open solar flux: we show that the solar minimum open solar flux, kinematically corrected to allow for the excess flux effect, has halved over the past two solar cycles.
Invariant (03:40:55) “[…] the time integral […]”
Are you working with a moving integral (sliding fixed-width integration-window) or a running integral (cumulative from a fixed-anchor-point at the beginning of the series)?
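To make the distinction concrete, here is a minimal Matlab sketch (the names are mine: x is the regularly sampled series, w the window width in samples):
running = cumsum(x);                 % running integral: cumulative sum from a fixed starting point
moving  = filter(ones(w,1), 1, x);   % moving integral: sum over a sliding window of the last w samples
moving(1:w-1) = NaN;                 % the first w-1 values come from an incomplete window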
–
Invariant (14:12:52) “I did not find the arctic surface data”
From …
http://climexp.knmi.nl/start.cgi?someone@somewhere
… look under “Select a field” and select “monthly observations”.
Then select the series you want (from the list) and hit the “Select field” button.
At the next step you can specify 70N-90N in the “Extract timeseries” box, demand at least 0% valid data points, and hit the “Make time series” button.
You will then be able to choose the “raw data” hyperlink beside the resulting graphical summaries to get numbers. (Note that the 3rd graph is “anomalies”.)
Invariant
I think you should look at Alan Cheethan’s page relating temperature and EMF. I found this evidence pretty graphic, formidable, and amazing.
Leif Svalgaard (10:43:45) :
It should be understood, but, to be very precise, the question is posed:
Leif, do you wish to enter into speculation on this?
Paul Vaughan (12:59:43) :
Lucy Skywalker (13:52:03) :
Leif Svalgaard (06:23:05) :
Keep working on it…
Thanks for all the help!
Now I have finally managed to compare and fit the time-integrated magnetic field of the solar wind (HMF B) with the global temperature (HADCRUT3).
1. HMF B http://www.leif.org/research/HMF-1835-now.xls
2. HADCRUT3 http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt
Using a least squares method (Matlab’s lsqnonlin function) I found the best fit from 1850 to 2009 to be
T(t) = 0.007640·T1(t) + T0
Here T1(t) is the cumulative sum of (HMF B − 5.7848) up to time t, T0 is the initial temperature given by HADCRUT3, and the two numerical constants were estimated by the program. The deviation between this curve and the real temperature is largest in 1910 and in 1940. Apart from that the fit looks reasonably good.
I have not really investigated whether another functional expression would work better, like square root or cube root, which sometimes may be the case in physics. I know very little about these things – I have just played a little with the data.
In Matlab the equation is
T_est = 0.007640*cumsum(HMF_B-5.7848)-0.4470;  % cumulative sum of (HMF B - 5.7848), scaled by 0.007640, with offset T0 = -0.4470
According to the Matlab equation, the temperature falls only 0.1225 degrees over the next 10 years (to 2020) with an HMF B of 4.25.
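For readers who want to try reproducing this kind of fit, here is a minimal sketch of how the lsqnonlin step might look. It is a sketch only: it assumes the Optimization Toolbox is available and that HMF_B and T_obs are equal-length column vectors holding the 1850-2009 HMF B and HADCRUT3 values on the same time step; the variable names are mine.
resid = @(p) p(1)*cumsum(HMF_B - p(2)) + p(3) - T_obs;          % residuals of the cumulative-sum fit
p0 = [0.01; 5; 0];                                              % rough starting guess for [scale; offset; T0]
p  = lsqnonlin(resid, p0);                                      % nonlinear least-squares estimate of the constants
T_est = p(1)*cumsum(HMF_B - p(2)) + p(3);                       % fitted temperature curve
T_next = T_est(end) + p(1)*cumsum(repmat(4.25 - p(2), 10, 1));  % 10-step extension at a constant HMF B of 4.25
The last line simply continues the cumulative sum at a constant HMF B of 4.25, which is how a decade-ahead figure like the one above can be read off the fit.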