As many readers know, I follow the Average Planetary Magnetic Index (Ap) fairly closely, as it is a proxy indicator of the magnetic activity of our Sun. Here is the latest Ap graph:
I’ve pointed out several times the abrupt and sustained drop in the Ap Index that occurred in October 2005.
David Archibald thinks it may not yet have hit bottom. Here is his most recent take on it.

The low in the Ap Index has come as much as a year after the month of solar cycle minimum, as shown in the graph above of 37-month windows of the Ap Index aligned on the month of solar minimum. For the Solar Cycle 23 to 24 transition, the month of minimum is assumed to be October 2008. The minimum of the Ap Index can be a year later than the month of solar cycle minimum, and the period of weakness can last eighteen months after solar cycle minimum.
The graph also shows how weak this minimum is relative to all the minima since the Ap Index started being measured in 1932. For the last year, the Ap Index has been tracking parallel to the Solar Cycles 16 – 17 minimum, but about four points weaker. Assuming that it has a character similar to the 16 – 17 minimum, the month of minimum for the Ap Index is likely to be October 2009, with a value of 3.
The shape of the Ap Index minima is similar to, but inverted relative to, the peaks in neutron flux, which usually come one year after the month of solar minimum.
David Archibald
January 2009
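For anyone who wants to reproduce the alignment Archibald describes, the windowing step is simple enough to sketch. A minimal Python illustration follows; the numbers below are made up for demonstration, and real monthly Ap values would have to come from the NOAA records:

```python
# Sketch: extract a 37-month window of monthly Ap values centred on the
# month of solar cycle minimum, so windows from different cycles can be
# overplotted, as in Archibald's graph. The data here are invented.

def window_around_minimum(monthly_ap, min_index, half_width=18):
    """Return the 37-month slice of `monthly_ap` centred on `min_index`.

    Months falling outside the record are padded with None so that
    every cycle's window has the same length.
    """
    window = []
    for offset in range(-half_width, half_width + 1):
        i = min_index + offset
        window.append(monthly_ap[i] if 0 <= i < len(monthly_ap) else None)
    return window

# Toy series: a shallow dip with its lowest value at index 5.
ap = [12, 10, 8, 6, 5, 4, 5, 6, 8, 10, 12]
w = window_around_minimum(ap, min_index=5)
assert len(w) == 37   # 18 months either side, plus the minimum month
assert w[18] == 4     # the centre of the window is the minimum itself
```

With each cycle's window built this way, plotting them on a common axis of "months from minimum" gives the aligned comparison described above.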

Psi (16:02:49) :
Geoff,
Based on your tracking of this phenomenon, are you willing to make a global temperature forecast for the next 1-5 years?
I think David Archibald is on track. I can give you a trend for 1-5 yrs, or I can give it for the next century. The next five yrs should definitely be cooler, as we will most likely be in a grand minimum; if not, expect cooling like SC20. We should see a modest recovery around SC26 with continuing mild temperatures, like the early 1900s. 2130 will see high sunspot activity greater than SC19, followed by cooling similar to SC20. After that I am not expecting a grand minimum for some time, as the angles are weakening considerably (there is a slim chance around 2190) as we head into another MWP-type era.
http://landscheidt.auditblogs.com/files/2008/12/ultimate_graph2all.jpg
I think that you guys are right about C/C++ being the best language of choice.
What about adding CUDA?
http://www.nvidia.com/object/cuda_home.html
NVIDIA’s new way of using your graphics card to turn your PC into a supercomputer with 128+ floating-point processors.
Just a thought-
It would make a good facebook group. I’m sure that there are plenty of people willing to code/validate out there.
Pyromancer, it’s a poorly-named effect. It’s not a greenhouse at all, but someone named it that way and it stuck.
To TJ:
We live in similar settings. My home is situated in an open, virtually treeless plot of 16 acres, and it is positioned 700 ft from a rural paved highway that is seldom traveled at night. My home is located 2 miles north of a small town of circa 1,600 people, and beyond that at another 14 miles to the east is a city of circa 110,000. Last week my wife and I were in the city for an evening event, and we departed for home about 10:00pm. It was a clear night as a cold front had passed through the previous day. The temperature reading on my truck instrument was 43F as we traveled on the Interstate in the city. It takes about 22-23 minutes to drive home. During the drive, the temperature reading gradually declined to 34F midway between the city and the small town in definite rural surroundings, then rose to 36 F when we entered the small town. At the gate leading to my home, it was again 34F, but by the time we parked near the house, it was 32F. I went into the house immediately and checked my remote temperature sensor located 200 ft away from my home (and 900 ft away from the rural highway), and it registered 31F. The city has an official weather reporting station for NOAA (it has been surveyed); the temperature recorded at 10:00pm in the city was 45F. (Who says UHI isn’t real?) As the highway leading to my home is poorly traveled at night (maybe 20 cars per hour at 10:00pm), the effect was not as noticeable as in your case. I have compared city and home temperatures for years; the maximum difference between the readings occurs during winter following recent passage of a cold front when the sky overhead clears and the dewpoint drops (and the heat is free to radiate).
Re: nobwainer (Geoff Sharp) (18:01:48)
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Geoff, how about our planet’s albedo rising due to increased cloud cover? Can you rule it out?
TJ (17:56:33) :
BTW, it is -10F outside; we are looking at -20F for the second time in a little over a week, and only the third time in my life. Based on my car thermometer, I think there is a different kind of UHI that even affects rural areas: vehicle-caused turbulence heating. … I think that the cold air settles in layers, with the dense -20F air near the ground and the warmer -10F air maybe 10 or 20 ft above. As trucks and cars drive by, the turbulence brings down the warmer air to the highway.
It would be interesting to test whether this effect accounts for much of the increase in temperatures in industrialized countries. It could be simply tested by, for instance, setting up two temperature stations alongside each other, in an area as near a road as many NOAA stations are, shielding one of the pair from the wind, and seeing if and when their temperatures diverge. If divergence is found, additional test pairs should be set up.
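The paired-station proposal above could be quantified very simply. Here is a rough Python sketch with invented readings; everything in it is illustrative, and "divergence" is defined as nothing more than a threshold on the paired difference:

```python
# Sketch of the paired-station test: two co-located sensors, one
# shielded from wind, one exposed. The readings below are synthetic;
# the point is only to show how "divergence" could be quantified.

def divergence(shielded, exposed, threshold=1.0):
    """Return the indices (e.g. hours) at which the paired readings
    differ by more than `threshold` degrees."""
    return [i for i, (s, e) in enumerate(zip(shielded, exposed))
            if abs(s - e) > threshold]

# Synthetic night: calm until hour 4, then traffic-stirred mixing
# warms the exposed sensor relative to the shielded one.
shielded = [-20.0, -20.1, -20.2, -20.2, -20.3, -20.3]
exposed  = [-20.0, -20.0, -20.1, -20.1, -18.5, -18.0]
assert divergence(shielded, exposed) == [4, 5]
```

If the divergent hours lined up with traffic counts, that would be at least circumstantial support for the turbulence-mixing idea.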
Joseph (19:47:28) :
Re: nobwainer (Geoff Sharp) (18:01:48)
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Geoff, how about our planet’s albedo rising due to increased cloud cover? Can you rule it out?
It’s very possible, but if so, a product of lower solar activity. It’s not all about TSI.
On the Fortran->C issue. The problem is not compiling the code to
another language, the problem is understanding what the code does.
Automatically generated C from Fortran would be harder to understand
than the original Fortran.
Like others I have been in the commercial software business for
decades and have used and taught dozens of languages. For portability I
suggest C, but the main problem is figuring out the algorithm. My
second choice for portability and speed would be object-oriented Perl,
a choice many will see as absurd, but it is FAST, easy to write, and
widely available. Heck, one could use FORTRAN, wrap it in a COM
layer and expose it many ways. The problem is documenting the algorithm.
I agree with the point about using a functional language. It is probably
the best approach, but has a serious drawback. So few people know
functional programming you can never convince anybody that the
program is valid, so nobody would trust the results. People would
still want the code in C.
Whatever language is used, visualization should be divorced from data
homogenization.
The code will execute quickly, whatever it is. There simply is not that
much data to crunch. A nice desk-side system today is a million times
faster than a nice computer from a few decades ago. 16GB of RAM now
costs as much as a nice dinner with wine.
At SIGGRAPH this past August I spent time with the Google Earth
developers asking about an idea. I thought it would be great to have
a full-earth layer that showed inaccuracy in data recording for temperature.
No such GE layer exists, but it is possible to create one using a proxy,
or man-in-the-middle, approach.
The Project VORTEX cars, used to gather real-time data on tornado genesis, are a good example of the vertical variability of the atmosphere. They had sensors placed to pull ambient air from well above the car, to avoid local heating from the pavement and the cars’ engines.
http://en.wikipedia.org/wiki/File:NSSL_vehicles_on_Project_Vortex.jpg
http://www.windows.ucar.edu/tour/link=/earth/Atmosphere/tornado/vortex.html
I have also observed rapid localized micro weather effects while driving. In areas where you frequently see fog form, (very shallow depressions) it is not uncommon to see large temperature gradients over very short distances late at night and early in the morning when cold air has pooled in those depressions. This sort of micro weather was a major problem for folks producing plume distribution models for hazardous material incidents.
I was involved with the emergency response planning for the Rocky Flats facility (Plutonium processing facility) and they had multiple weather observation towers around the facility to track possible trajectory of any plume release. It was not at all uncommon for no two of the observation towers to record the same temperature and wind conditions.
Plume models were written that tried to account for the microweather caused by the terrain effects they had to deal with.
It was very difficult to achieve any reasonable ability to predict where or how fast a plume of toxic material would move off site in case of an incident.
http://www.ofcm.gov/atd_dir/pdf/trac.pdf
Reed Hodgin was the primary developer of the TRAC code as I recall.
It might be interesting to hear his observations regarding atmospheric modeling.
Larry
Ed (a simple old carpenter) (13:04:57) :
I would think this is basic stuff and the answers should be settled, but there seems to be an argument about this simple concept.
I do not know if somebody has replied to your puzzlement.
The “misnomer” of the global “greenhouse” effect is not disputed even by Real Climate followers. It is a mistaken analogy. The climate “greenhouse” has nothing to do with a tomato greenhouse. It is a label that has been attached to the effect of various infrared-absorbing gases in the atmosphere, the most important of these being water vapor. The effect of these is real.
Empirically, think of the desert: how dry it is, and how cold it gets at night just because of this. Generally, humid nights keep the heat in everywhere, as anybody who has some curiosity about the world he lives in will know. That is the “greenhouse” effect in climate.
How this happens is another confusing story, but happens it does.
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
We don’t know. Nor do we know what caused the much larger climate shifts from inter-glacial to glacial phases, and back, of the (current) Ice Age.
Volcanoes are put forward as an explanation because they are the only natural forcing we know about that can affect the climate by that much. Although, no one can explain how volcanic eruptions can affect climate for the decades to centuries needed for the LIA and MWP, and the millennia to hundreds of thousands of years required for glacials/interglacials.
It’s certain that over century and longer scales there is a climate driver we (or at least mainstream climate science) know nothing about (Svensmark’s GCR theory is a good candidate).
Which means CO2 is only an important driver (to the extent it is important, and that’s debatable) in the periods where the century-scale driver is quiescent, as was true in the 20th century. It remains to be seen if the same holds true for the 21st century.
So, while something has to have caused the LIA, MWP and glacial/interglacial cycles, I see no evidence it’s the energy output implied by the level of solar activity. If it is the Sun, it has to be via some indirect effect, such as albedo changes due to cloud cover.
Thanks to E.M. Smith, squidly, Ed Scott, cal and others for their very interesting and informative posts.
An excellent idea! A couple of days ago, shortly after President Obama made that statement, I sent the following to him:
The email address is comments@whitehouse.gov
[sarcasm on]
I’m sure he would be more than pleased to hear from the rest of you, should you feel so inclined.
[sarcasm off]
I don’t really expect anything to come of it, but if enough others also wrote … well, who knows?
@anna v
Confusing it may be, but the propaganda is wide and deep.
NCAR and a host of other “mainstream” institutions equate the ‘greenhouse effect’ to a real glass greenhouse.
http://www.ucar.edu/learn/1_3_1.htm
There is no shortage of this.
As to its effect, one has to wonder why the desert does in fact get very hot in the daytime with very little water vapor, while in the tropics temperatures do not appear to exceed a maximum threshold despite abundant water vapor. Perhaps Miskolczi’s paper holds more weight than his detractors care to acknowledge, and this is why NASA refused to publish his work.
Ross (22:16:15) :
In the interest of true scientific advancement, all published scientific theories and conjectures must be subject to review
Review by whom?
Douglas DC (08:27:36) :
Amazing !!!!!
Carsten Arnholm, Norway (17:06:51) :
Squidly (16:25:40) :
Richard M (14:55:24) :
Fractals are the newest hot math for describing complex systems.
I am sure this just blew over heads LOL
oh well
FORTRAN is not a controlled language; C++ might be the way to go.
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Probably had to do with the lack of solar activity and the lineup of things that it leads to.
The Big Ice Ages got so cold that the CO2 (antifreeze) went out of the atmosphere into sequestration. They finally ended when sufficient extended solar activity warmed the oceans enough to release the CO2 antifreeze to keep the Ice Sheets from reforming during solar minima.
The Earth wants to play freeze-out whenever solar activity drops off.
The Interglacials are really brief compared to the Ice Ages.
12,000 yrs out of 100,000 is a sharp reminder, and we see what a Modern Maximum can do to glaciers and how few and far between they are even in this interglacial.
What is being thrown out with the bathwater is the effect of solar minimums.
Looking for the switch that caused Glaciation within a period of 1 year?
http://1965gunner.blogspot.com/2008/08/last-ice-age-happened-in-less-than-year.html
I am a C#/SQL programmer.
Frank Perdicaro (21:19:39) :
C# 3.x incorporates many ideas from functional languages: lambda expressions, LINQ, type inference. Maybe this is the “best of both worlds”? (Now I’ve gone and done it – here come the foaming-at-the-mouth “MS is the anti-christ” hordes…)
Squidly (16:18:54) :
Squidly, I am not grey-haired yet (just a tinge at the temples), but I have been around long enough to participate in the pursuit of the programmer’s “holy grail”, OOPLs, and even though I think you state a popular opinion among many non-academic programmers today, I would have to disagree with your statement. I took C++ in college, not COBOL, but I could see that structured programming already had many of the key elements OOP boosters want to claim as exclusive to OOPLs (for the non-CS-geeks: OOPL = Object-Oriented Programming Language).
Paul Graham started a little company that used Lisp to gain a competitive advantage: http://www.paulgraham.com/avg.html. It was bought out by Yahoo and he is now a venture capitalist. He (and others) are not impressed by claims of superiority for OOPLs: http://www.paulgraham.com/noop.html. (BTW: I would highly recommend Paul’s essays on a variety of subjects – I really enjoy his writing style, even if I don’t always agree with his opinions).
I use C# and there are many very nice aspects to it that I like a lot. However, I think the idea that it is just CS101 that complex problems must always be implemented in an OOPL is simply not correct. I am also a SQL programmer, and I enjoy the declarative nature of SQL far more than writing loops in C#. SQL is far more readable and efficient for many types of complex problems (far more, in fact, than many OO programmers appear to realize, since they often write complex, abstruse code in a C# client or Web Service that should be implemented as SQL queries, IMHO).
And you thought AGW was contentious! Just look what a couple of simple suggestions for the “proper” language to use for a simple project can do when uttered in the presence of a few CS-geeks! (We went through the “language wars” a couple of years ago in our organization: Java vs C# – talk about stress!).
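The declarative-versus-imperative point in the comment above can be shown in a few lines. Here is a neutral Python sketch of the same aggregation written both ways; the SQL in parentheses is just the analogy, not runnable code:

```python
# The same aggregation written imperatively and declaratively.

temps = [45, 34, 36, 34, 32, 31]

# Imperative: spell out, step by step, HOW to accumulate the answer.
total = 0
count = 0
for t in temps:
    if t < 35:
        total += t
        count += 1
loop_mean = total / count

# Declarative: say WHAT you want, much as a SQL query would
# (SELECT AVG(t) FROM temps WHERE t < 35).
below = [t for t in temps if t < 35]
decl_mean = sum(below) / len(below)

assert loop_mean == decl_mean == (34 + 34 + 32 + 31) / 4
```

The two forms compute identical results; the argument is purely about which one a reviewer can read and verify faster.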
Re: Philip McDaniel (11:21:56)
You forget that not everyone is prepared to (or has the money to) pay for a Windows license and a Visual Basic license. The inclusion of ActiveX controls will also result in many people not being able to view the results correctly.
If the project ever gets off the ground, then it should use a language that is well defined and available on multiple platforms.
Will people talking about computer languages and other irrelevancies go away please? This is a thread about solar activity.
Best Science blog it may be, but this blog needs better moderation. Maybe a chat room on the side.
I think David Archibald is on track. I can give you a trend for 1-5 yrs, or I can give it for the next century. The next five yrs should definitely be cooler, as we will most likely be in a grand minimum; if not, expect cooling like SC20.
Was there cooling during SC20? The mid-20th century cooling began in the 1940s. SC20 didn’t start until 1964 – around 20 years later. The end of SC20 (in 1976) actually signalled the start of the modern warming era.
Like a lot of the solar theory stuff – things just don’t add up.
It’s quite interesting this magnetic stuff I think! I’ve been tracking the ‘aa’ Index for the past few years. I keep an updated database on the Monthly value, which is from the International Geomagnetic Indices database: http://isgi.cetp.ipsl.fr/lesdonne.htm
The ‘aa’ Index posted a value of ‘9’ in November. December was back to ’12’ again. The November monthly value ties with June of 1954, which also posted a ‘9’. Before that we have to go back to 1936, in which September also posted a ‘9’. So we are still joint bottom! To find a value below ‘9’ we need to go back further, to November of 1927, which posts ‘8’.
Now, I’m sure Leif disputes the early database of the ‘aa’ Index. I’m sure he mentioned he thought it was underestimated; I might have remembered this wrong. But if so, then November 2008 might be even more unusual than the pre-1927 records suggest.
Now, maybe we should be looking at annual averages here. After all, it is the Sun. If we look at this, then 2008 as an annual figure ties once again with 1965, both posting ‘14’. Actually, 1965 was 13.75 and 2008 was 14.08, but they round to the same integer, as I don’t believe we use decimals for magnetic indices. But I could be wrong on that!
Previous to that, the years 1923 and 1924 both post Annual Averages of ’10’. So a long way back whatever.
Here’s a few plots anyway. Monthly plot since 1950, showing nicely the tick-down in 2005 as already shown on the Ap Index:
http://www.wacv.co.uk/charts/climate/solar_image.php?SelectSeries=aa&SelectTemp=None&SelectRes=Monthly&StartYear=1950&EndYear=2008
Annual since 1900, showing 2008 at the lowest level reached since at least the 1920s (and possibly further back, if you don’t trust this earlier data):
http://www.wacv.co.uk/charts/climate/solar_image.php?SelectSeries=aa&SelectTemp=None&SelectRes=Annual&StartYear=1900&EndYear=2008
You can explore the ‘aa’ database at the following link. Select “Solar Series” and then ‘aa’ from the pull down and choose Monthly/Annual and the date range.
http://www.wacv.co.uk/index.php?option=com_wrapper&view=wrapper&Itemid=23
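The annual-average arithmetic in the comment above is easy to check. A small Python sketch follows; the twelve-month series are invented stand-ins that reproduce the quoted means, and the real monthly values live in the ISGI database linked above:

```python
# Sketch of the annual 'aa' comparison: 13.75 and 14.08 both round to
# the same integer, 14. The monthly series here are stand-ins.

def annual_aa(monthly_values):
    """Mean of the monthly aa values, rounded to the nearest integer,
    since the published magnetic indices are whole numbers."""
    return round(sum(monthly_values) / len(monthly_values))

months_1965 = [13.75] * 12   # stand-in giving the 13.75 annual mean
months_2008 = [14.08] * 12   # stand-in giving the 14.08 annual mean
assert annual_aa(months_1965) == 14
assert annual_aa(months_2008) == 14
```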
Further to the ‘what programming language is best’ debate: I think it’s pretty pointless trying to re-write the GISS code, because you’d never be able to prove that your re-write was functionally the same as the original. I’d be far more interested in someone doing some thorough black-box testing on the code as it is, to find out whether it actually works correctly. If you could prove that the code always produces results with a warm bias, then you’d really put GISS on the spot.
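To make the black-box idea concrete: feed the routine a synthetic series with a known trend and measure what comes out. In the Python sketch below, `adjust` is a purely hypothetical stand-in for the routine under test, not the actual GISS code:

```python
# Black-box bias check: run an adjustment routine on a series whose
# trend is known by construction, then compare input and output trends.
# `adjust` is a hypothetical stand-in, NOT the real GISS routine.

def linear_trend(series):
    """Least-squares slope of a series against its index."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def adjust(series):
    # A deliberately biased stand-in: adds 0.01 of warming per step.
    return [v + 0.01 * i for i, v in enumerate(series)]

flat = [10.0] * 50                 # input with zero trend by construction
bias = linear_trend(adjust(flat)) - linear_trend(flat)
assert bias > 0.009                # the stand-in injects ~0.01/step
```

Run over many synthetic inputs (flat, warming, cooling, noisy), a consistently positive `bias` would be exactly the kind of evidence the comment asks for; a bias near zero would clear the routine on this test.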
Philip_B (21:54:21) :
We don’t know. Nor do we know what caused the much larger climate shifts from inter-glacial to glacial phases, and back, of the (current) Ice Age.
Is that right? I thought the IPCC was quite comfortable with the Sun as a driver before 1950. Milankovitch cycles of around 100,000 years quite clearly explain our ice age cycles; it’s a simple thing, the Earth’s orbit moves from round to elliptical.
Volcanoes are put forward as an explanation because they are the only natural forcing we know about that can affect the climate by that much.
So we know everything, and there can’t be anything else? But once again you forget the Sun and the acceptance of solar forcing before 1950. The ENSO pattern might just be another, and then we have the Earth’s albedo, and don’t forget the emerging area of UV.
Although, no one can explain how volcanic eruptions can affect climate for the decades to centuries needed for the LIA and MWP, and the millennia to hundreds of thousands of years required for glacials/interglacials.
I think you need to provide some links showing us how volcanoes have anything to do with past ice ages. Mt Tambora was one of the biggest eruptions in recent times, during the Dalton. It was reported to have a very generous SO2 content, yet ice core records show only 2-3 years’ total residence in the atmosphere; that is a long way short of your statement.
http://en.wikipedia.org/wiki/File:Greenland_sulfate.png
All other volcanoes, including Krakatoa, since the late 1800s show very minimal to nil change in the world GISS records. Volcanoes have a very short-term effect on climate.
So, while something has to have caused the LIA, MWP and glacial/interglacial cycles
These arguments are often heard from AGW supporters who discredit the Sun and hold volcanoes up as an answer…. heard it several times before in here.