Archibald makes an Ap Index prediction

As many readers know, I follow the Average Magnetic Planetary Index (Ap) fairly closely as it is a proxy indicator of the magnetic activity of our sun. Here is the latest Ap Graph:

I’ve pointed out several times the abrupt and sustained drop in the Ap Index that occurred in October 2005.

[Ap Index graph – click for a larger image]

David Archibald thinks it may not yet have hit bottom.  Here is his most recent take on it.

[Figure: archibald_ap-index – click for larger image]

The low in the Ap Index has occurred up to a year after the month of solar cycle minimum, as shown in the graph above of 37-month windows of the Ap Index aligned on the month of solar minimum. For the Solar Cycle 23 to 24 transition, the month of minimum is assumed to be October 2008. The minimum of the Ap Index can be a year later than the month of solar cycle minimum, and the period of weakness can last eighteen months after solar cycle minimum.

The graph also shows how weak this minimum is relative to all the minima since the Ap Index started being measured in 1932. For the last year, the Ap Index has been tracking parallel to the Solar Cycle 16–17 minimum, but about four points weaker. Assuming that it has a character similar to the 16–17 minimum, the month of minimum for the Ap Index is likely to be October 2009, with a value of 3.

The shape of the Ap Index minima is similar to, but inverted relative to, the peaks in neutron flux, which usually occur about one year after the month of solar minimum.

David Archibald

January 2009

sdk
January 24, 2009 12:10 am

Does/has our star ever ‘switch[ed]’ poles, as our planet has? I don’t have a good feeling for our planet’s inhabitants if this event were to occur.
In any case, I tend to agree: those just born and living long enough may be in for a gradual surprise.

peter vd berg
January 24, 2009 12:29 am

I’m double posting this from the Solar Cycle 24 ramp thread; I didn’t see this post before.
In the earlier discussion on the solar cycle 24 ramp I gathered that the best that can be done is to reconstruct about 10,000 years of solar activity.
I forget the exact age of the sun, but say it’s 4×10^9 years.
That makes the observed period roughly 0.00025% of the total.
I have a hard time keeping track of my pocket change, but even I can see that’s pretty flimsy to build whatever kind of prognosis/theory on, even taking into account what is assumed about the general dynamics of a star’s life.
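Spelled out with those round numbers, the fraction is

$$\frac{10^{4}\ \text{years observed}}{4\times 10^{9}\ \text{years of solar history}} = 2.5\times 10^{-6} \approx 0.00025\%$$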

Adam Gallon
January 24, 2009 1:19 am

We have geomagnetic “fossil” records for magnetic flipping of the poles, but obviously there’s no similar record for solar flipping.

Robert Bateman
January 24, 2009 2:48 am

Then I was being generous in assigning the first good sunspot of SC24 to be Sept 22, 2008. I had SC 24 ramp from late Sept. 2009 to late May, 2010.
That’s a deep hole to climb out of, no matter which indicator one is using.
http://www.nwra-az.com/spawx/ssne-cycle23.html
NWRA’s effective sunspot number graph for SC23 also shows the deep hole dug by the tardy SC24.

DaveK
January 24, 2009 3:17 am

Um… I thought that “flipping” of the sun’s magnetic field was part of the regular solar cycle? That is, it happens about once every 22 years.
Please correct me if I’m wrong about that.

Editor
January 24, 2009 3:41 am

Is the Ap index still dropping like this ‘a bad thing’, a ‘really bad thing’ or a ‘gee, interesting…’ thing? Just eyeballing the graph and with the statement that it’s a proxy for solar output, I feel like buying some longjohns…
Then there is this head scratcher:
Well I’ll be. James Lovelock, the greens’ green and creator of Gaia mythology, agrees that carbon trading is a waste of time! I’ve softened a couple of his words a bit (my edits are in [square brackets] in the quote) to save the moderator a ‘snip’…
From:
http://www.newscientist.com/article/mg20126921.500-one-last-chance-to-save-mankind.html
Not a hope in [heck]. Most of the “green” stuff is verging on a gigantic scam. Carbon trading, with its huge government subsidies, is just what finance and industry wanted. It’s not going to do a [darn] thing about climate change, but it’ll make a lot of money for a lot of people and postpone the moment of reckoning. I am not against renewable energy, but to spoil all the decent countryside in the UK with wind farms is driving me mad. It’s absolutely unnecessary, and it takes 2500 square kilometres to produce a gigawatt – that’s an awful lot of countryside.
NOAA & GISS:
And finally, I’ve done a first pass through the NOAA data and the GISS code. I’m still figuring out what it all means (table of variables with description? You’ve got to be dreaming. Comments? OK, how about one cryptic one per program?) At this point though, my ‘first blush’ is that NOAA has the false precision problem. They hand over ‘monthly mean’ data in 1/100 degree C precision. I don’t see how that is even remotely possible.
It also looks (per the terse readme) like GISS uses the UHI-unadjusted NOAA data set rather than the adjusted one (though it is a manual download – easy ftp! – so anyone could use any dataset at the time of running the code). In the ‘readme’ the GHCN and HCN station description files have the .Z ending confounded: the readme for one said to use it (when it was missing); the other said not (when it was there). Hope this isn’t a trend.
Finally, it looks like all GISS does is glue together the HCN, GHCN, and Antarctic data (plus some small bits) with some removal of dups and ‘preening’, then does the magic UHI homogenization dance, and some final formatting/cleaning. So that would lead me to believe that a simple cross-check dataset can be made by taking the NOAA UHI-adjusted data directly and doing station-to-station comparison graphs.
From the GISS Readme:

GISS Temperature Analysis
=========================
Sources
——-
GHCN = Global Historical Climate Network (NOAA)
USHCN = US Historical Climate Network (NOAA)
SCAR = Scientific Committee on Antarctic Research
Basic data set: GHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2
v2.mean.Z (data file)
v2.temperature.inv.Z (station information file)
For US: USHCN – ftp://ftp.ncdc.noaa.gov/pub/data/ushcn
hcn_doe_mean_data.Z
station_inventory
For Antarctica: SCAR – http://www.antarctica.ac.uk/met/READER/surface/stationpt.html
So a simple ftp window on the data location, in your browser, gets you the UHI-adjusted data (though the ‘fahr’ implies it is in °F), which you can compare to GISS to see what he’s doing. Or just use the same dataset he uses, not UHI-adjusted by anyone…
From the HCN Readme:
urban_max_fahr.Z Urban Heat Adjusted Maximum Monthly Temperature
urban_calc_mean_fahr.Z Urban Heat Adjusted Mean Monthly Temperature
(Calculated from urban.max.Z and urban.min.Z)
urban_mean_fahr.Z Urban Heat Adjusted Mean Monthly Temperature
urban_min_fahr.Z Urban Heat Adjusted Minimum Monthly Temperature

For some unknown reason, GISS breaks the processing down into steps 0, 1, 2, 3, 4-5. The start and end are FORTRAN, but step 1 is Python (with some C bits to compile and install on your machine). Go figure… The good news is that the PApars.f chunk in Step 2 that does the pasteurizing process is the one bit of code that does have decent comments in it.
The code is not particularly complex. It has oodles of undocumented variables and many scratch files, especially between steps, so decoding it will take a bit of work. My estimate is that the code could be shrunk by about 60% with no loss of function.
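For illustration, here is a minimal sketch of the kind of station-to-station cross check described above. It assumes the NOAA and GISS outputs have already been reduced to simple whitespace-separated rows of station id, year, month, and temperature; the file names and that intermediate format are hypothetical, not anything NOAA or GISS actually ship.

```cpp
// Minimal sketch of a station-to-station cross check between two datasets.
// Assumes both inputs are plain text rows: station_id  year  month  temp_degC
#include <fstream>
#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <tuple>

using Key = std::tuple<std::string, int, int>;  // station, year, month

std::map<Key, double> load(const std::string& path) {
    std::map<Key, double> out;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream row(line);
        std::string station;
        int year = 0, month = 0;
        double temp = 0.0;
        if (row >> station >> year >> month >> temp)
            out[Key{station, year, month}] = temp;
    }
    return out;
}

int main() {
    auto noaa = load("noaa_adjusted.txt");   // hypothetical file names
    auto giss = load("giss_output.txt");
    // Print the difference wherever both datasets carry the same station-month.
    for (const auto& [key, t_noaa] : noaa) {
        auto it = giss.find(key);
        if (it != giss.end())
            std::cout << std::get<0>(key) << " " << std::get<1>(key) << "-"
                      << std::get<2>(key) << " diff = "
                      << (it->second - t_noaa) << "\n";
    }
    return 0;
}
```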

Mary Hinge
January 24, 2009 3:59 am

Adam Gallon (01:19:24) :
We have geomagnetic “fossil” records for magnetic flipping of the poles, but obviously there’s no similar record for solar flipping.

The first thing that came to my mind after reading this article was the point mentioned above, a ‘Solar Flip’. Is it possible that this could be a precursor to an ‘Earth Magnetic Flip’, and if so, how could you test this? I am no solar scientist so I hope Leif can give some pointers here.

nobwainer (Geoff Sharp)
January 24, 2009 5:00 am

In nearly 12 months we hit the spot where the solar system exerts its maximum disturbing influence on angular momentum on the Sun; it’s a strange situation. Normally it would be expecting a very low angular momentum count, but there are 2 planets ganging up on Saturn that have other ideas…and it happens on a regular basis, on avg every 172 yrs; we have records showing this for at least 6000 years. This also lines up with grand minima nearly every time for the same period.
Here’s a graph showing that disturbance….follow the green arrow at 2010.
http://landscheidt.auditblogs.com/files/2008/12/sunssbam1620to2180gs1.jpg

Alan the Brit
January 24, 2009 5:09 am

Sdk/Dave K:-)
I am no solar scientist either, but the magnetic field reverses every ~11-year Schwabe cycle & flips back again after the ~22-year Hale cycle. I am sure the likes of Leif Svalgaard can direct you to good sites that give the basics in lay terms.
However, that is an interesting point: do the flips back & forth in the sun at some stage, for whatever reason, affect the Earth’s magnetic flips? I seem to recall that these occur every few hundred thousand years or so; again, better brains than mine can advise on this phenomenon!
I read an article by geologist Texan Gregory Benson a few years ago explaining all the likely causes of Climate Change, hot or cold, without any need for CO2 involvement, including some very interesting data on Solar variations. Well worth a good read for lots of interesting background info for anyone interested in further informal study!

Alan the Brit
January 24, 2009 5:10 am

Sorry that should have read ‘Texan Geologist’ not the way it was typed.

January 24, 2009 5:12 am

The observed Ap continued to fall in November and December 2008, as predicted from the above, and this correlates with the previously posted article concerning cosmic rays. More than a coincidence that the globe is cooling? Svensmark’s theory should be on the front pages as the first real theory on climate change — instead of this CO2 political science theory.

Pat
January 24, 2009 5:18 am

This is a great information and discussion site, love it and thanks to all. I have studied chemistry and planetary science for many years, and have “followed” the CO2 climate change hype (yes, and all of the other doomsayer terms in the past) since the late 60’s. The CO2 forcing does not add up, and only ~100 ppm of the CO2 is from humans over ~150 years.
Sea ice growing in the Arctic, Labrador in particular; sea ice shrinking in the Antarctic. Errrmm….
Interglacial periods are warmer, Al “I invented the internet” Gore.
From real data, not IPCC-speak: better get knitting, it’ll be cold “soon”!

captdallas2
January 24, 2009 5:24 am

Since Leif has not shown up yet: the solar magnetic field flips regularly, every 7.5 to 15 years. The last time the Earth’s field flipped was about 800 Kyears ago. The state of the Sun and its magnetic field is interesting, but not alarming. It may lead to a grand minimum, but the change in Total Solar Irradiance is small so don’t count on a solar driven little ice age.

Roy
January 24, 2009 5:58 am

Re http://www.nwra-az.com/spawx/ssne-cycle23.html posted above: what could a negative sunspot number possibly mean (with reference to the “light” line)?

Editor
January 24, 2009 6:00 am

sdk (00:10:53) :

Does/has our star ever ‘switch[ed]’ poles, as our planet has? I don’t have a good feeling for our planet’s inhabitants if this event were to occur.

The poles flip every solar cycle, so you’ve lived through several already.
—-
Just to save Leif the time – that October 2005 drop is no big deal and similar events have happened frequently in the past. OTOH, I have no idea why that happens. It looks interesting, it might be interesting, so I figure that’s why Anthony points it out every time there’s a reason to show the Ap plot.

pochas
January 24, 2009 6:20 am

nobwainer (5:00:07)
“In nearly 12 months we hit the spot where the solar system exerts its maximum disturbing influence…”
When will this influence become apparent? Instantly? What is your vision of the process taking place on/in the sun that generates sunspots? Are there no time lags involved?

January 24, 2009 6:48 am

I’m impressed by the fact that even the short term variation is smoother. It really looks quiet compared to the rest of the record. It’s too bad we don’t have more time in the record. I think Leif and others may be right about cycle 24 being very weak.
Off topic: RealClimate just stated that models predicted Antarctic warming all along. They already knew; those guys are so smart. In February ’08 they said the same thing about cooling in the Antarctic.
http://noconsensus.wordpress.com/2009/01/24/real-climate-doubletalk-blog-food/

January 24, 2009 6:56 am

Hey, look at the bright side!
Regardless of the current trend, and, regardless of how long the current (solar) depression lasts, at least the API – (and unlike the DOW) – can never go negative!
Heck, the sunspot number is already 0.0. That can’t get any lower either.

Harold Ambler
January 24, 2009 7:09 am

Dear E.M. Smith:
Thank you for doing this important work! I would like to be able to e-mail you questions about your progress and findings for the book I’m researching on the climate wars. The safest way for me to get you my e-mail address would be if you could leave a comment (“Hello” would be enough) on my weather and climate blog (http://www.talkingabouttheweather.com). Thank you for your consideration!

Ed Scott
January 24, 2009 8:27 am

Glacier Slowdown in Greenland: How Inconvenient
http://www.worldclimatereport.com/index.php/2009/01/23/glacier-slowdown-in-greenland-how-inconvenient/
Maybe Gore will go back and remove the 12 pages worth of picture and maps from his book showing what high profile places of the world will look like with a 20-foot sea level rise (“The site of the World Trade Center Memorial would be underwater”). But then again, probably not—after all the point is not to be truthful in the sense of reflecting a likely possibility, but to scare you into a particular course of action.

Douglas DC
January 24, 2009 8:27 am

When my late Mother passed, we had a beautiful funeral in LaGrande, Oregon. The organist at her church is a well known Warmist. It was October of ’05. We were walking across the parking lot on a golden fall day. She said: “Sure is warm anymore!” I said, dang near on the day the Ap dropped, “Yeah, but this is a cycle; I think the solar cycle is going to quiet down” – this after reading that paper on the slowing solar conveyor. She looked at me like – HERETIC! – and said: “You haven’t read Algore’s book!” I: “No, only excerpts” – then she stormed away, saying “#%$&- Republicans!” – oh, I’m not a Republican BTW…
Then she hit me for $75 bucks for her Organ wizardry…

January 24, 2009 8:49 am

All this talk about the sun controlling Earth’s climate, you realize you are inflating its ego. Who would believe you that the sun, the solar system and maybe the universe is in charge of Earth’s climate, when we all know it’s all the animals on the planet exhaling. You should be ashamed of getting the sun all pumped. Who knows what the sun might do next, to just drive the point home.
And isn’t that the real problem science should be dealing in? FACTS …
E.M.Smith — Why not propose an open source rewrite of the data reduction program? There must be enough programming types here to do that, in short order. And why not use PYTHON or BASIC, something almost everyone can read? I have often wondered why not just do this in the open for ourselves. Hey, someone could write a book documenting the process and code — LOL.

mark wagner
January 24, 2009 9:11 am

It may lead to a grand minimum, but the change in Total Solar Irradiance is small so don’t count on a solar driven little ice age.
Uhm…. regardless of the cause, there is significant historical evidence that grand minima DO result in a cooler earth.

Ed Scott
January 24, 2009 9:15 am

Falsification Of The Atmospheric CO2 Greenhouse Effects Within The Frame Of Physics
http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.1161v3.pdf
The atmospheric greenhouse effect, an idea that authors trace back to the traditional works of Fourier 1824, Tyndall 1861, and Arrhenius 1896, and which is still supported in global climatology, essentially describes a fictitious mechanism, in which a planetary atmosphere acts as a heat pump driven by an environment that is radiatively interacting with but radiatively equilibrated to the atmospheric system. According to the second law of thermodynamics such a planetary machine can never exist. Nevertheless, in almost all texts of global climatology and in a widespread secondary literature it is taken for granted that such mechanism is real and stands on a firm scientific foundation. In this paper the popular conjecture is analyzed and the underlying physical principles are clarified. By showing that (a) there are no common physical laws between the warming phenomenon in glass houses and the fictitious atmospheric greenhouse effects, (b) there are no calculations to determine an average surface temperature of a planet, (c) the frequently mentioned difference of 33 degree C is a meaningless number calculated wrongly, (d) the formulas of cavity radiation are used inappropriately, (e) the assumption of a radiative balance is unphysical, (f) thermal conductivity and friction must not be set to zero, the atmospheric greenhouse conjecture is falsified.
————————————————————-
SUMMARY: Falsification Of The Atmospheric CO2 Greenhouse Effects Within The Frame Of Physics
http://www.tech-know.eu/uploads/Falsification_of_the_Atmospheric_CO2_Greenhouse_Effects.pdf
Among climatologists, in particular those affiliated with the Intergovernmental Panel on Climate Change (IPCC), there is a “scientific consensus” that the relevant climate mechanism is an atmospheric greenhouse effect, a mechanism heavily reliant on the presumption that radiative heat transfer dominates over other forms of heat transfer such as thermal conductivity, convection, condensation, et cetera. Supposedly to make things more precise, the IPCC introduced the notion of radiative forcing, tied to an assumption of radiative equilibrium.
“…applying cavity radiation formulas to the atmosphere is sheer nonsense.”
“CO2’s influence on the Earth’s climate is definitively immeasurable.”
“Hence, the computer simulations of global climatology are not based on physical laws. The same holds for the speculations about the influence of carbon dioxide.”
“The natural greenhouse effect is a myth, not a physical reality. The CO2-greenhouse effect, however, is a manufactured mirage. Horrific visions of a rising sea level, melting pole caps and spreading deserts in North America and Europe are fictitious consequences of a fictitious physical mechanism which cannot be seen even in computer climate models. More and more, the main tactic of CO2-greenhouse gas defenders seems to be to hide behind a mountain of pseudo-explanations that are unrelated to an academic education or even to physics training. The points discussed here were to answer whether the supposed atmospheric effect in question has a physical basis. It does not. In summary, no atmospheric greenhouse effect, nor in particular a CO2-greenhouse effect, is permissible in theoretical physics and engineering thermodynamics. It is therefore illegitimate to use this fictitious phenomenon to extrapolate predictions as consulting solutions for economics and intergovernmental policy.”

sdk
January 24, 2009 9:31 am

My vote would be to use C, and a procedural approach! The results would be quite readable and maintainable, if coded from pseudo code viewed that way.

sdk
January 24, 2009 9:33 am

And the machine code would be MUCH more understandable if one had to ‘dig’ for optimizing. Also, the C-to-asm interface is quite nice for those routines that crunch numbers and therefore must be optimized.

squidly
January 24, 2009 9:46 am

E.M.Smith — Why not propose an open source rewrite of the data reduction program? There must be enough programming types here to do that, in short order. And why not use PYTHON or BASIC, something almost everyone can read? I have often wondered why not just do this in the open for ourselves. Hey, someone could write a book documenting the process and code — LOL.

Actually, once I have completed a rather large project I am engaged in at work, I should have a bit more free time on my hands and I would like to attempt such a task as you speak of. However, my goal is to rewrite their model in Java, create a web interface for it and publish it on the Internet for everyone to play with as they please. I have begun to organize the code in anticipation of proceeding with such an undertaking. Please be patient, however, as this will take some effort and I am typically a rather busy individual. But, this does excite me. Welcome to my world. 😉

RK
January 24, 2009 9:55 am

I think Al Gore wins no matter what. If this AGW theory turns out to be bogus (which it seems to be), his notoriety surely will rise to rival Mr. Ponzi’s and outlive his mortal time on earth. Gorified science?

Robert Bateman
January 24, 2009 10:13 am

All the wind generators in the UK mentioned above remind me of the stunt that was put on the ballot in California. It got a resounding boot from the general vote, the Dem and the GOP parties for its obvious drawback: a huge landgrab resulting in one man playing Eminent Domain God.
Not to worry about the public getting misled: Al Gore is responsible for grabbing attention to the climate. Others such as David Archibald hand the public what it needs to sort out what is happening around them in a manner befitting true Science.
Observe and consider.
Observe the descending cold.
Consider what is going on and compare it to past events.
Come to a conclusion not driven by those who wish to profit from our misfortune.

Robert Bateman
January 24, 2009 10:15 am

Al Gore has succeeded in drawing attention to the Earth’s climate.
Inquiring minds by the billions will figure things out for themselves.
Just give them both sides of the Equation.

squidly
January 24, 2009 10:15 am

sdk (09:31:49) :
my vote would be to use C, and a procedural approach ! the results would be quite readable and maintainable, if coded from pseudo code viewed that way.

If you are going down the C road, I would recommend C++ and writing it in a modular / OOP architecture, especially if you are looking to involve multiple developers. Management of the project would be much easier, and coding could be much more concise and require far less documentation while maintaining clear readability.
My personal preference would be to write it in Java. Although the performance would not be quite as good (probably still better than Fortran/Python however), it would be a lot easier to publish on a J2EE server, create a java servlet filter, create a web interface and allow the public to play with the model. This may be a pie in the sky idea, as it could potentially require horsepower that I do not possess, but it may be worth a try anyway.
I personally believe that there should be a legal mandate (and perhaps there is) that would require Hansen’s team (and any other publicly funded projects) to create their models in such a fashion, and require them to do such publishing, so that the public (their employers) could actually use the software that we keep paying for. I have always thought it to be incredibly ridiculous that these guys have been able to hide a lot of this stuff from us. I believe (although I am not of legal background) that if one really investigated thoroughly, one would find that Hansen and others have been violating several laws concerning full disclosure of this kind of research, paid for with taxpayer dollars. Hansen, et al, have not historically been forthcoming. What’s Obama’s new buzzword? “Transparency”

January 24, 2009 10:21 am

I am curious about solar magnetic influence on climate. Solar magnetic field strength impacts TSI to some extent. Also, the Svensmark effect is another mode of impact. But I am curious as to what is the contribution to the earth’s energy budget from solar magnetic field strength in the form of geomagnetically induced currents (GICs)?
http://www.agu.org/pubs/crossref/2008/2008SW000388.shtml
http://en.wikipedia.org/wiki/Geomagnetically_induced_current
The number is not zero, but is it significant? The solar magnetic field interacts with the earth’s magnetic field, which in turn induces electric current in the earth (and oceans). Electric current flowing through some resistance generates heat. This is the same principle used for induction cook stoves now becoming popular.
The magnetic fields and electric currents are small, but the effect is global. I know the effect is non-zero, but could it be on the order of 1 W/m^2? 0.1 W/m^2?

Adam Gallon
January 24, 2009 10:22 am

Ooops, there I go, mouth opens, both feet rammed firmly in!
Sun merrily flipping its magnetic poles.
Ed Scott, err, there is a greenhouse effect in the earth’s atmosphere; otherwise we’d be at a balmy 18 or so below zero C.
“The natural greenhouse effect is a myth, not a physical reality. The CO2-greenhouse effect, however, is a manufactured mirage”
A mountain of male bovine excrement there old chap.
Few scientists would argue that the greenhouse effect doesn’t exist; the debate is over to what extent AGW exists.

Editor
January 24, 2009 10:37 am

The shape of the Ap Index minima is similar to, but inverted, the peaks in neutron flux, which are usually one year after the month of solar minimum.
So…. Could the history of neutron flux via some geologic proxy be used to ‘reconstruct’ the probable Ap index into the past? And could that be used to validate the sunspot minima dates during times of poor observations or missing spots (like grand minima or very long ago…)?
Or is it just too many degrees removed from reality with no suitable proxies?

Syl
January 24, 2009 11:01 am

E.M.Smith
Good work!
“So that would lead me to believe that a simple cross check dataset can be made by taking the NOAA UHI adjusted data directly and doing station to station comparison graphs.”
Probably not, unfortunately. NOAA does not actually make UHI adjustments to the data, but makes an allowance in the uncertainty instead, from what SteveM has been able to find.
http://www.climateaudit.org/?p=4901

hotrod
January 24, 2009 11:12 am

E.M.Smith — Why not propose an open source rewrite of the data reduction program? There must be enough programming types here to do that, in short order. And why not use PYTHON or BASIC, something almost everyone can read? I have often wondered why not just do this in the open for ourselves. Hey, someone could write a book documenting the process and code — LOL.

I agree: do an open source re-write, then let them tell us if or where your code varies from the original source code. If they don’t protest the new coding (with ample comments for future reference), it would be a service to the entire debate and a demonstration of how good scientific coding should be done.
It also might highlight some questionable issues in the original code, such as the false precision mentioned above or other manipulations that will not stand up to open analysis.
It is a left-handed way to force them to document their code, since they refuse to give proper documentation of the how and why in their processes.
Larry

hotrod
January 24, 2009 11:20 am

It might also lead to some enterprising computer science majors who might want to start an open source project, on the model codes themselves, to at least comment the code and analyze the code blocks and what they are doing.
I imagine there are several master’s theses sitting there for the taking, for math majors, computer science majors, and physics majors to dig into the major climate models and produce annotations and analysis of their methodology. Open source documentation of how they work, and the limits of their accuracy from a pure mathematical, physical, and statistical point of view, would be invaluable to the world community.
Larry

Philip McDaniel
January 24, 2009 11:21 am

“E.M.Smith — Why not propose an open source rewrite of the data reduction program? There must be enough programming types here to do that, in short order. And why not use PYTHON or BASIC, something almost everyone can read? I have often wondered why not just do this in the open for ourselves. Hey, someone could write a book documenting the process and code.”
Try Visual Basic… if Microsoft hasn’t changed it beyond all recognition. I used it several years ago to build some programs that analyzed Diesel engine performance. The language allows inclusion of ActiveX controls, of which there were many built by independent programmers and companies. Some of these controls were pretty sophisticated graphing routines. There were even fuzzy logic and neural net controls; I played around with some of this to see if I could detect time to failure on an engine component. Visual Basic was easy to write in and debug, and the compiled programs ran pretty fast—fast enough to actually control a Diesel engine via remote radio over about 5 miles with complete safety shutdowns. Version 6 should still be around and fairly cheap, likewise various ActiveX controls. Of course, this is a Windows-based tool and the programs compile (using Microsoft’s C compiler engine) to run in Windows.

Carsten Arnholm, Norway
January 24, 2009 11:25 am

sdk (09:31:49) :
my vote would be to use C, and a procedural approach ! the results would be quite readable and maintainable, if coded from pseudo code viewed that way.

C++, perhaps used as “a better C” to keep it simple, would be my choice. Much better support for strings and other useful things like containers and classes that improve readability enormously. But C and C++ go well together. It is also possible to mix C++ with existing Fortran if needed; I wrote a tutorial on it a decade ago:
http://arnholm.org/software/cppf77/cppf77.htm
An open source library for climate data processing is indeed a good idea.

January 24, 2009 11:45 am

“Falsification Of The Atmospheric CO2 Greenhouse Effects Within The Frame Of Physics”
http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.1161v3.pdf
Ed, that is a very technical paper and difficult to follow. I think the argument is put in layman’s terms nicely by Essex and McKitrick in their revised edition of “Taken by Storm.”

Logan
January 24, 2009 11:59 am

Ed Scott — thanks. The Gerhard Gerlich and Ralf D. Tscheuschner paper summary should be read by all here. At this time the warmists claim that most scientific societies endorse the greenhouse garbage. One would hope that physical research groups would be the first to defect to the rational side if the AGW theory (or propaganda) is as fundamentally flawed as G&T claim. And, if there is someone here who is qualified to critique G&T, let’s see it.
Perhaps readers of this blog can help to promote a high level debate at the national academy of sciences here and its equivalents worldwide. So far, the AGW people have essentially disorganized and unfunded opposition, while the warmists have multi-millions in grants and media support. They are, in a sense, winning by default. High level critiques might help, but the ultimate rebuttal may come from nature over the next several years.

January 24, 2009 12:16 pm

Re the effect, if any, of atmospheric CO2 on the climate.
I posted this link a few weeks ago, but it may be useful here again.
A prominent PhD and Professional Chemical Engineer, Dr. Pierre R. Latour, wrote on this from a process control viewpoint (see link below).
My earlier post was criticized by some because Dr. Latour supposedly is paid by “the oil companies,” and that makes his opinion invalid. Yes, Dr. Latour has worked for oil companies and others, as have I for many years. However, the control principles he states, the engineering and physical principles he states, are not debatable. They are applicable and applied in many industries, not just oil.
To sum up his argument, to control a process one must choose a manipulated variable that is independent, and has a measurable and consistent causal effect on the variable one wants to control. CO2 in the atmosphere, whether natural or man-made, meets none of the criteria.
Or, in my native cowboy-speak, “If yall are gonna control the global temperature, yall better git a better handle than CO2. That dog won’t hunt.”
http://www.hydrocarbonprocessing.com/index.html?Page=14&PUB=22&SID=715446&ISS=25220&GUID=D5E78EC9-18EA-4C76-9A48-D4D9279140FB
and scroll down to “Author’s Reply.”
Roger E. Sowell
Marina del Rey, California. Where it is raining yesterday and today. And a 3.4 earthquake was centered 3 miles offshore yesterday at 7:43 p.m. local.

Ed (a simple old carpenter)
January 24, 2009 1:04 pm

I don’t understand half of what’s said here at this blog, but I keep reading and learning a little more each day.
And I can’t stop reading about this stuff; it’s an addiction.
Now my latest confusion is on this greenhouse effect. I have read laboriously through some papers arguing that there is no such thing as the greenhouse effect outside of a real greenhouse. To sum up as best as I can: in a greenhouse the sun heats the objects and air, and the hot air rises but is trapped because of the glass. But outside it is different: the sun heats objects and the air, the heat rises and continues to rise because there is nothing to stop it, maybe slow it down but no stopping. After all, heat flows from hot to cold, and if it is colder at higher altitudes then heat must continue to rise until it escapes our atmosphere.
So where’s the greenhouse? I’m confused.
I would think this is basic stuff and the answers should be settled, but there seems to be an argument about this simple concept.

pyromancer76
January 24, 2009 1:27 pm

Ed Scott:
Thanks for the reference to “The Atmospheric CO2 Greenhouse Effects Within the Frame of Physics”. In contrast to Dave L, I think this physics is essential for most of us who read WUWT, but I can only appreciate it from a non-scientist perspective. In particular, it seems to me that the history of the idea of the atmosphere acting “as a heat pump driven by an environment that is radiatively interacting with but radiatively equilibrated to the atmospheric system”, i.e., “the greenhouse effect”, goes back to 1824. From the perspective of the historian, 1824 to the present has given us many years for a mistaken idea to permeate many levels of society. To a gardener a greenhouse is a lovely, intuitive idea for our beautiful earth and its atmosphere.
As a constant reader of WUWT, I would appreciate further information about the physics. The paper emphasizes “that until today the ‘atmospheric greenhouse effect’ does not appear:

Robert Bateman
January 24, 2009 1:28 pm

The general public is going to understand this to the extent that they are able to follow it. For those that connect with them, there will be increasing support.
For those that bury everything in deep hieroglyphic mumbo-jumbo and secret models known only to a select few, there will be nothing but scorn and dismissal.
Their senses tell them something is going down, and it’s not what the mainstream is telling them.

len
January 24, 2009 1:31 pm

Thanks Ed Scott,
I was just going to look for some real discussion about CO2 as a GHG. I know from documented exchanges between serious climate scientists and coders ( http://ca.youtube.com/watch?v=yW-wkCtdLZM& ) and other references to the information that the empirically derived properties are tens of orders of magnitude different from the IPCC’s assumptions, arrived at by backward calculating with GIGO ‘creative accounting’ statistical models to some silly number. Now you’ve just made my intellectual life almost too easy 😀
http://www.tech-know.eu/uploads/Falsification_of_the_Atmospheric_CO2_Greenhouse_Effects.pdf
A link worth repeating.

Ozzie John
January 24, 2009 1:38 pm

—–
The state of the Sun and its magnetic field is interesting, but not alarming. It may lead to a grand minimum, but the change in Total Solar Irradiance is small so don’t count on a solar driven little ice age.
—–
It’s interesting to see what people contemplate whilst fishing in the Florida Keys, but perhaps during the next fishing trip CaptDallas2 should also contemplate what evidence we have regarding the TSI during the Maunder Minimum, since the SORCE satellite was not in orbit during the 1600s. It may be that solar irradiance remains high during a grand minimum, but the only way to rule this out is through measurement of such an event. Using data from 2003 as a guide makes for a big assumption.

Robert Bateman
January 24, 2009 1:39 pm

My greenhouse is attached to my house. The last 2 years it has served us well as the growing season starts later. It certainly is NOT outside.

bradley13
January 24, 2009 1:42 pm

I think re-engineering the code would be a huge service.
If you will allow a small observation from a gray-hair: this is the type of application for which object-oriented languages are really poorly suited. Java is a great language, and so is C++. You don’t need objects here – in fact, you will have to work around the OO aspects of the language in order to get a decent implementation.
For this sort of problem, Fortran is a good choice. Even better would be a functional language like Lisp or ML.
I teach part-time and I always tell my students: a programming language is a tool. Would you hire a mechanic who only owns a screwdriver? It may be a really nice screwdriver, but sometimes you need a wrench. Any really good programmer will know several completely different programming languages, and know when each one is appropriate.

pyromancer76
January 24, 2009 1:58 pm

Ed Scott:
Thanks for the reference to “The Atmospheric CO2 Greenhouse Effects Within the Frame of Physics”. In contrast to Dave L (and Adam Gallon), I think this physics is essential for most of us who read WUWT, but I can only appreciate it from a non-scientist perspective. In particular, the paper suggests that the idea of the atmosphere acting “as a heat pump driven by an environment that is radiatively interacting with but radiatively equilibrated to the atmospheric system”, i.e., “the greenhouse effect”, goes back to 1824. From the perspective of the historian, 1824 to the present has given us many years for a mistaken idea to permeate many levels of society. To a gardener a greenhouse is a lovely, intuitive idea for our beautiful earth and its atmosphere.
As a constant and grateful reader of WUWT, I would appreciate further information about the accuracy of the physics. The paper emphasizes “that until today [2007] the ‘atmospheric greenhouse effect’ does not appear: in any fundamental work of thermodynamics; in any fundamental work of physical kinetics; in any fundamental work of radiation theory” (p. 44) If the atmosphere were a greenhouse, then we would need a glass-wall effect rather than gravity, wouldn’t we? Is CO2 imagined as this glass wall, and is this imagination part of what has given the idea its power? If so, more science education for all our citizens!
And to Ric Werme, “that October 2005 drop is no big deal….” Is it no big deal because it does not provide a possible indication of significant cooling soon?

Pamela Gray
January 24, 2009 1:59 pm
Daniel M
January 24, 2009 2:38 pm

Adam Gallon (10:22:10) :
A mountain of male bovine excrement there old chap.
Few scientists would argue that the greenhouse effect doesn’t exist, the debate is to what extent AGW exists.
You obviously did not read the paper…
The claim is that the analogy comparing the atmospheric “effect” to a greenhouse is invalid, a “myth”. There is no claim that global temperatures are unaffected by the atmosphere. You might call this semantics, but the “debate”, or lack thereof, is being driven by language, and BIG decisions are being made by politicians and a populace largely ignorant of the actual science involved.
This poor analogy is at the very root of what is wrong with the current AGW argument.

Richard M
January 24, 2009 2:55 pm

sdk (09:31:49) :
“my vote would be to use C, and a procedural approach ! the results would be quite readable and maintainable, if coded from pseudo code viewed that way.”
I believe there is an automated tool for converting Fortran to C. That might be a good start and then convert that code to Java.
I also like the idea of making this available as open source so the entire programming community can have at it.

Richard M
January 24, 2009 2:58 pm

As I mentioned above, for automated conversion see:
http://en.wikipedia.org/wiki/F2c

Joel Black
January 24, 2009 3:26 pm

Sorry this is OT, but I didn’t see another way to get in touch with you.
This is the address of a newly published commentary in the APS Forum on Physics and Society written by Robert E. Levine. In it, he decries the advocacy implied in the APS’s statement on AGW that was recently the focus of the uproar over their publication of Monckton’s letter. He asserts that the APS has abandoned the “openness principles” of scientific inquiry by issuing a supporting statement for AGW without promoting the continuing research that will improve our understanding of the governing forces of our climate.
http://www.aps.org/units/fps/newsletters/200901/levine.cfm

gary gulrud
January 24, 2009 3:31 pm

“The Gerhard Gerlich and Ralf D. Tscheuschner paper summary should be read by all”
Unfortunately, those unprepared to do so are swayed by braggadocio and hollow pretension.

cal
January 24, 2009 3:45 pm

Ed 13:04:57 asks “where is the greenhouse effect”
I think there is no doubt that certain gases like water vapour and carbon dioxide absorb long wavelength radiation and that this keeps the world warmer than it would be if they were not present. The problem is that this has been called a “greenhouse effect”. We were told in school that the glass of the greenhouse lets sunlight in but does not let infra red radiation out. As a result of this radiation imbalance the temperature rises until it is sufficient to support losses from the outer surface of the glass to balance the incoming radiation energy. Whilst this schoolboy explanation has some validity, it is pretty obvious that the main reason that a greenhouse warms is not the radiation imbalance but the prevention of convection. For example, greenhouses have been made from plastic that is equally transparent to sunlight and infra red radiation (they work perfectly), and one only has to open a roof vent and most of the “greenhouse effect” goes.
I don’t think we should get too hung up about the descriptor “greenhouse gas” although I do regret the fact that it makes the average punter think he knows what is going on and has a “picture” in his mind which is miles away from reality.
Water vapour, and to a lesser extent CO2, methane and other molecules with dipole energies in the infra red region really do affect the radiation balance. There is a good description in Wikipedia but this is my understanding of how it works.
The atmosphere is mainly composed of nitrogen and oxygen and these are transparent to infra red radiation. If these were the only gases in the atmosphere the surface would radiate directly into space. In this scenario the incoming solar energy would be balanced by the outgoing infrared radiation even if the surface temperature was well below zero. The effect of the “greenhouse” gases is to absorb infra red radiation of certain specific wavelengths. Water vapour absorbs relatively weakly but has a large number of absorption bands whilst CO2 absorbs strongly but mainly in one 13 to 18 micron band.
Radiation from the surface, within these bands, is absorbed by the atmosphere within a few hundred feet. The absorbing molecules then reradiate the energy in all directions. Some of this is downwards and this adds to the solar flux reaching the earth and warms it. The rest is radiated upwards and this is then reabsorbed and reradiated until it gets to a point in the atmosphere where any photon radiated upwards is unlikely to be absorbed, and so it is radiated into space. If one increases the concentration of the absorbing gases, the altitude of this final radiation into space increases. As temperature decreases with height, this reduces the temperature of the radiating layer. Because this layer is colder it radiates less, and therefore the surface has to increase in temperature to ensure increased radiation at the other wavelengths to maintain the energy balance.
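To put rough numbers on that balance (standard textbook values, not anything from the comments above): with a solar constant of about 1366 W/m² and an albedo of about 0.3, a planet whose atmosphere absorbed no infrared at all would radiate to space at an effective temperature of

$$T_e = \left(\frac{S(1-\alpha)}{4\sigma}\right)^{1/4} = \left(\frac{1366 \times 0.7}{4 \times 5.67\times10^{-8}}\right)^{1/4} \approx 255\ \text{K}$$

against an observed mean surface temperature near 288 K, which is the familiar ~33 K difference mentioned in the paper quoted earlier in the thread.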
This is not to say that increasing CO2 will automatically cause global warming. The problem is that the CO2 effect is very small. A doubling of CO2 on its own is only thought to cause a 1 degree F change in global temperatures. To achieve the dire predictions quoted by the scaremongers one has to postulate positive feedback mechanisms such as reduced albedo through melting ice and further radiation forcing through increased water vapour. At the same time one has to ignore the possibility of negative feedbacks such as increased convection losses, decreased water vapour through increased precipitation and higher albedo due to increased cloud.
So the effect is real even if the descriptor is misleading. But the magnitude of the effect is small until proven otherwise. At one time the AGWs argued that there was no other explanation for the recent “unprecedented” rises in temperature and this was proof enough for them of the positive feedback mechanisms. But now the temperature has gone down again and there are a plethora of alternative theories as to why the climate changes.
The onus is on them to really prove their theories.

Psi
January 24, 2009 4:02 pm

nobwainer (Geoff Sharp) (05:00:07) :
In nearly 12 months we hit the spot where the solar system exerts its maximum disturbing influence on angular momentum on the Sun, its a strange situation. Normally it would be expecting a very low angular momentum count, but there are 2 planets ganging up on Saturn that have other ideas…and it happens on a regular basis on avg every 172 yrs, we have records showing this for at least 6000 years. This also lines up with grand minima nearly every time for the same period.
Here’s a graph showing that disturbance….follow the green arrow at 2010.
http://landscheidt.auditblogs.com/files/2008/12/sunssbam1620to2180gs1.jpg
Geoff,
Based on your tracking of this phenomenon, are you willing to make a global temperature forecast for the next 1–5 years?

Squidly
January 24, 2009 4:18 pm

@ bradley13 (13:42:10) :

I think re-engineering the code would be a huge service.

I would agree with you that re-engineering the code would indeed “be a huge service”, but after perusing their code during my first pass, I suspect that approaching this as a project in any manner would be a great undertaking. This is no trivial task.

If you will allow a small observation from a gray-hair:: this is the type of application for which object-oriented languages are really poorly suited. Java is a great language, and so is C++. You don’t need objects here – in fact, you will have to work around the OO aspects of the language in order to get a decent implementation.

As a fellow “gray-hair” with more than 27 years as a computer scientist, professionally active in real world problem solving, I would have to disagree. I believe the modeling of complex chaotic systems is precisely an object oriented problem. I have spent many years in the past developing software for modeling complex human behaviors in a quantitative fashion. I have also developed software for the DOD and DHS, again modeling rather complex systems in a large variety of environments (cannot really detail). Modeling these kinds of complex relationships is not nearly as easy in procedural languages. This is the reason why we have developed object oriented languages. The more complex the system relationships, the more useful object oriented languages become. This is really Computer Science 101.

For this sort of problem, Fortran is a good choice. Even better would be a functional language like Lisp or ML.

Fortran was a good technology for mathematical computation, as its primary focus is directed towards mathematics in syntax, but as far as mathematical computation in itself, there is no computational performance advantage over many other languages such as C or C++ or many other similarly compiled languages. As for LISP or ML, I cannot possibly imagine trying to tackle such a project using these tools. Fortran to LISP is quite a sharp contrast. I have had rather extensive experience (way back) developing applications with both languages, and they have absolutely no correlation to each other whatsoever. They have vastly different fundamental purposes, and in my opinion, neither is well suited for modeling complex chaotic systems. I will cite one exception to this: LISP would be well suited to describing the model structure and relationships themselves, as that is what it is intended for, but it is very poor in its computational effectiveness. Since modeling climate would require crunching huge volumes of data, I believe LISP would be an extremely poor candidate overall, as it is designed for managing set theory algorithms and primarily list processing.

I teach part-time and I always tell my students: a programming language is a tool. Would you hire a mechanic who only owns a screwdriver? It may be a really nice screwdriver, but sometimes you need a wrench. Any really good programmer will know several completely different programming languages, and know when each one is appropriate.

May I ask what it is you teach? Your prior paragraph is quite shocking to me if I am to consider you an educator. Perhaps this explains why I learned far more about computer “tools” and problem solving in my professional career than I ever did in my collegiate career. I have been fluent in more than 40 programming languages and am currently actively developing using at least 12 of those, ranging from Assembler programming on PCs and IBM mainframes to Java J2EE distributed systems on Intel and AS/400 platforms. I know the value of having a robust toolbox, knowledge base and experience to draw from. I have spent a good portion of my career successfully guiding companies on their own choices of development environments, from languages to operating systems to hardware platforms, always goal oriented and overall subject specific. I am definitely not one of those preconception geeks that only adopts the fashionable technology (ie: “Windows sux .. Linux is all”), rather quite the contrary; each technology has its advantage or disadvantage and each in turn has its place. Throughout my career I have also been a very outspoken opponent of such practice and I have even forfeited some rather large contract opportunities because of my adherence to fundamental principles instead of adopting the industry buzzword of the day.
I believe that we are currently experiencing something quite interesting in the industry of modeling in general, especially and specifically when talking climate modeling. It appears to me that climate modelers, such as Hansen’s group, continue to throw huge amounts of money and hardware at their problems, but are neglecting to adopt more modern development languages and techniques. I believe this is in large part due to the fact that they are government operated. I worked for the government for a period of time, and I know how they operate, and I can say from a developer’s perspective, it was a somewhat sad experience in this regard. I utilize far greater technology on my own home projects than they do on some of the multi-million dollar projects I worked on at the DOD. That’s pretty sad.
The other problem with their development efforts is that they usually span such an inordinate amount of time. Take for instance these climate models. This isn’t new code. They have been writing, rewriting, revamping, modifying, and basically “cowboy coding” this stuff for years. Just take a look at the code and you can clearly see this. My experience has shown me that whenever an organization finds themselves in this position with a project, they are eternally bound to the technology in use. They have not the monetary nor human resources to do otherwise. The programmers are unlikely to press for anything else, as they pay their 40 hours and collect their checks. To them, having a mess is good job security in a job that usually supplies a respectable salary, great benefits in a pretty laid back and easy going environment. Not a bad position to be in if you can get it (speaking from experience). The people that head these projects, such as Hansen, don’t actually know technology all that well. No, they don’t! Again, speaking from experience. They only know the buzzwords, they don’t actually understand the intricacies of implementations nor the pros/cons of the specific technology implementations themselves.
In closing, I would agree that Fortran could still work for these models, although it is not ideal in my judgment. LISP, ML or any similarly narrowly focused languages are out of the question, and I am still shocked at the suggestion. C might be alright, but if you are going to go that way, you would be better off leveraging the power of OOP with C++. Java has the advantage of portability and the possibility (as stated prior) to easily couple to a web interface and expose to the general public, but has the disadvantage of a performance trade-off. Additionally, and perhaps most importantly, if launching such a project will utilize multiple programmers, OOP lends itself to cooperative development efforts much more easily than procedural languages do. The idea is to build a lot of little “black boxes”, build the relationship connectors and simply plug them together. Each individual (or sometimes a team) is responsible for her/his box and the function it provides. Others need not know what is inside the box specifically, only what the box requires as input, and in what form it will spit out the output. If this doesn’t scream climate model, I guess I am completely missing the problem at hand.
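To make that “black box” idea concrete, here is a minimal sketch in C++ (all type, class and data names are hypothetical illustrations, not taken from any existing model code): every stage exposes only what it takes in and what it puts out, so stages can be developed independently and chained.

```cpp
// Minimal sketch of the "black box" pipeline architecture described above.
// All names here are hypothetical illustrations.
#include <memory>
#include <vector>

struct StationSeries {                 // whatever record the pipeline passes along
    long station_id;
    std::vector<double> monthly_mean;  // monthly mean temperatures
};

using Dataset = std::vector<StationSeries>;

// Every processing stage implements the same narrow interface...
class Stage {
public:
    virtual ~Stage() = default;
    virtual Dataset run(const Dataset& in) const = 0;
};

// ...so a pipeline can chain stages without knowing their internals.
Dataset run_pipeline(Dataset data, const std::vector<std::unique_ptr<Stage>>& stages) {
    for (const auto& stage : stages)
        data = stage->run(data);
    return data;
}

// Example stage: drop stations that carry no data (purely illustrative).
class DropEmpty : public Stage {
public:
    Dataset run(const Dataset& in) const override {
        Dataset out;
        for (const auto& s : in)
            if (!s.monthly_mean.empty()) out.push_back(s);
        return out;
    }
};

int main() {
    std::vector<std::unique_ptr<Stage>> stages;
    stages.push_back(std::make_unique<DropEmpty>());
    Dataset result = run_pipeline({{1, {5.1, 6.2}}, {2, {}}}, stages);
    return static_cast<int>(result.size());  // one station survives
}
```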

Squidly
January 24, 2009 4:25 pm

@ Richard M (14:55:24) :

I believe there is an automated tool for converting Fortran to C. That might be a good start and then convert that code to Java.

Probably several, and this sounds like a plausible idea. One might even be able to leverage such a tool to fundamentally clean the code by replacing some of the identifiers with more meaningful ones. Could be a very good place to begin. Great suggestion!

nobwainer (Geoff Sharp)
January 24, 2009 4:25 pm

pochas (06:20:03) :
nobwainer (5:00:07)
“In nearly 12 months we hit the spot where the solar system exerts its maximum disturbing influence…”
When will this influence become apparent? Instantly? What is your vision of the process taking place on/in he sun that generates sun spots? Are there no time lags involved?
It’s totally apparent now; it’s a gradual process as the alignments move into phase. The process taking place is largely unknown, but there is some evidence that the change in angular momentum changes the spin rate of the Sun. This could be internal or overall, but the differential rotation speed is varied. Ian Wilson and Javaraiah have studies on this phenomenon. Angular momentum is not a start/stop force, but more a gradual rise and fall. Having said that, we still need to consider inertia and how situations like that after SC20 show how a big slowdown can affect the next cycles. I think SC21 and 22 activity would have been even higher, as the angular momentum was very high at the time.
My graph shows this here http://landscheidt.auditblogs.com/files/2008/12/ultimate_graph2.jpg

CodeTech
January 24, 2009 4:32 pm

Meh – I tried using the Fortran-to-C converter on the GCM (Model E) and discovered it’s essentially useless. The amount of manual effort required to make that thing anything like usable is far greater than that required to simply re-write it… but then again, that’s a horrible piece of software.
What really scares me is the thought that NASA is operating their space program with similar quality of programming… no wonder the occasional Mars lander vanishes, and satellites and space labs occasionally fall out of orbit.

Squidly
January 24, 2009 4:48 pm

@ CodeTech (16:32:25) :
Hahah, somehow, this doesn’t surprise me. I was rather astonished at the poor quality of the code myself.
This may require quite a bit of hand picking to accomplish. Perhaps a method of divide and conquer may be necessary: identify blocks of related coding, break those into modules, further identify components and break them out accordingly. Iterate through this process until you can identify an overall architectural structure, analyze and develop an object oriented architectural model, then translate each of the previously discovered components into their respective objects, develop the relationships and plug them together. Voila! … sounds easy huh? Probably not so much, but still perhaps worth a try.

Ron de Haan
January 24, 2009 4:48 pm

David Archibald’s Ap prediction provides an indication that our solar system will have to cope with a reduced protective magnetic shield for many months to come.
It will be interesting to observe the effects described by Nasif Nahle:
http://biocab.org/Cosmic_Rays_Graph.html
His findings provide the following conclusion(s):
“Without a doubt, the present variability in the tropospheric temperature of Earth is directly attributable to the instability of the intensity of ICR, which include He++ and H+ nucleons and electrons. The long and the short intervals match unexpectedly”.
His findings eliminate the AGW doctrine and provide support for the Svensmark Theory.
It also provides support for David Archibald’s predictions.
Now unleash the wolves.

January 24, 2009 5:06 pm

Squidly (16:25:40) :
@ Richard M (14:55:24) :
I believe there is an automated tool for converting Fortran to C. That might be a good start and then convert that code to Java.
Probably several, and this sounds like a plausible idea. One might even be able to leverage such a tool to fundamentally clean the code by replacing some of the identifiers with more meaningful ones. Could be a very good place to begin. Great suggestion!

Squidly : I also have about 27 years of experience as a software developer and software architect. I agree with your long post regarding languages and conclusion regarding C++. I would just add that C++ is multi paradigm and does not enforce object orientation per se, it can be used as “a better C”. But I would use it precisely for object orientation and I do that daily.
I have also long experience with Fortran and how to integrate it with C++. I can say with absolute certainty that trying to convert undocumented and messy Fortran to C using an automatic tool is the sure way to a useless result….
There are essentially two workable options:
1. Call, from C++, existing Fortran subroutines without touching the Fortran source code, or
2. Redesign the code from scratch
If the existing code is non-trivial and not well understood, the second option will be the fastest.
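To illustrate option 1, here is a minimal sketch. It assumes a hypothetical Fortran routine SUBROUTINE AVGTMP(T, N, RESULT) and a g77/gfortran-style calling convention (lower-case external name with a trailing underscore, all arguments passed by reference); other compilers mangle names differently, so treat it as illustrative only.

// Minimal sketch of option 1: calling an untouched Fortran subroutine from C++.
// AVGTMP is a hypothetical routine:  SUBROUTINE AVGTMP(T, N, RESULT)
// Assumes g77/gfortran name mangling (lower case + trailing underscore) and
// Fortran's pass-by-reference convention; adjust for other compilers.
#include <iostream>

extern "C" {
    void avgtmp_(float* t, int* n, float* result);
}

int main() {
    float temps[3] = {12.5f, 13.0f, 11.5f};
    int n = 3;
    float mean = 0.0f;
    avgtmp_(temps, &n, &mean);   // the Fortran side fills in 'mean'
    std::cout << "mean = " << mean << '\n';
    return 0;
}

The Fortran source itself stays exactly as it is; it is simply compiled separately and linked in.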

Ron de Haan
January 24, 2009 5:34 pm

RK (09:55:21) :
“I think Al Gore wins no matter what. If this AGW theory turns out to be bogus (which seems to be), his notoriety surely will rise to rival Mr. Ponzi’s and outlives his mortal time on earth. Gorified science?”
RK, how is that possible?
The USA now has a President that made the following statement:
“Because the truth is that promoting science isn’t just about providing resources — it’s about protecting free and open inquiry. It’s about ensuring that facts and evidence are never twisted or obscured by politics or ideology.” Barack Obama
And recently he announced that he will communicate with all Americans directly via e-mail.
What could go wrong?

Squidly
January 24, 2009 5:46 pm

@ Carsten Arnholm, Norway (17:06:51) :

There are essentially two workable options:
1. Call, from C++, existing Fortran subroutines without touching the Fortran source code, or
2. Redesign the code from scratch
If the existing code is non-trivial and not well understood, the second option will be the fastest.

I guess I would personally be more inclined toward option #2, but perhaps, not yet knowing enough detail of the code we are working with here, this could be more than it seems on the surface.
I am hesitant to simply break out modules of Fortran and call them from another language, as that doesn’t appear to me to gain anything of value in the long run. If I were to involve myself in such a project, it would be to further develop the process and ultimately create a model that is more accessible to others, such as through a web interface. Hence my tendency to lean towards a Java/J2EE solution, as it lends itself nicely to developing such a web application. This of course would necessarily require a complete rebuild, and perhaps a tremendous amount of work.

TJ
January 24, 2009 5:48 pm

“so don’t count on a solar driven little ice age.”
I am not counting on one, I am waiting for the grand experiment to produce a result. I am not assuming I know the answer because I have looked at the output of models that can’t possibly have the skill to describe the next step in our climate. Tell you what though. If there is a little ice age, I don’t plan to spend a lot of time explaining how the planet got it wrong and the models are still right.

TJ
January 24, 2009 5:56 pm

BTW, it is -10F outside, we are looking at -20F for the second time in a little over a week and only the third time in my life. Based on my car thermometer, I think there is a different kind of UHI that even affects rural areas: vehicle-caused turbulence heating. I noticed, on these past two extremely cold nights, that my car thermometer dropped 5 degrees F when getting off a traveled highway onto my lightly traveled road, and when it was -20F, it dropped another 5 degrees when driving the 700 ft up my non-traveled driveway.
I think that the cold air settles in layers, with the dense, -20F air near the ground, and the warmer, -10F air maybe 10 or 20 ft above. As trucks and cars drive by, the turbulence brings down the warmer air to the highway. The colder air stays settled near the ground, for instance, at my garage, which is far from the road and separated by a small wood.

Squidly
January 24, 2009 6:00 pm

@ TJ (17:48:40) :

I don’t plan to spend a lot of time explaining how the planet got it wrong and the models are still right.

Now that is one of the best ideas I have heard in a while! I wish the AGW proponent groups would heed that idea.

January 24, 2009 6:01 pm

captdallas2 (05:24:13) :
Since Leif has not shown up yet: The solar magnetic field flips regularly every 7.5 to 15 years. The last time the Earth’s field flipped was about 800 kyears ago. The state of the Sun and its magnetic field is interesting, but not alarming. It may lead to a grand minimum, but the change in Total Solar Irradiance is small, so don’t count on a solar driven little ice age.
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.

Squidly
January 24, 2009 6:07 pm

@ TJ (17:56:33) :
I have wondered about these things myself on occasion. At times, I can simply walk around my yard and feel a difference in temperature. This always makes me wonder how we can assume we are able to measure ground surface temperatures with any discernible accuracy, especially over time as stations are moved from place to place, and most especially to 1/10 or 1/100 of a degree. That seems to me to be impossible.

old construction worker
January 24, 2009 6:19 pm

Ed (a simple old carpenter) (13:04:57) :
‘And I can’t stop reading about this stuff, it’s an addiction. ………
So where’s the greenhouse? I’m confused’
Well, old carpenter, I recommend the following link.
He does a nice job of explaining the so-called ‘greenhouse effect’:
http://wwwjunkscience.com/Greenhuose/

old construction worker
January 24, 2009 6:24 pm

I need to clean my glasses
http://www.junkscience.com/Greenhouse

January 24, 2009 6:25 pm

Psi (16:02:49) :
Geoff,
Based on your tracking of this phenomenon, are you willing to make a global temperature forecast for the next 1-5 years?

I think David Archibald is on track. I can give you a trend for 1-5 yrs or I can give it for the next century. The next five yrs should definitely be cooler as we will most likely be in a grand minimum; if not, expect cooling like SC20. We should see a modest recovery around SC26 with continuing mild temperatures, like the early 1900’s. 2130 will see high sunspot activity greater than SC19, followed by cooling similar to SC20. After that I am not expecting grand minima for some time, as the angles are weakening considerably (there is a slim chance around 2190) and we head into another MWP-type era.
http://landscheidt.auditblogs.com/files/2008/12/ultimate_graph2all.jpg

Mr H
January 24, 2009 7:16 pm

I think that you guys are right about C/C++ being the best choice of language.
What about adding CUDA?
http://www.nvidia.com/object/cuda_home.html
NVIDIA’s new way of using your graphics card to turn your PC into a supercomputer with 128+ floating-point processors.
Just a thought-
It would make a good facebook group. I’m sure that there are plenty of people willing to code/validate out there.

Jeff Alberts
January 24, 2009 7:22 pm

Pyromancer, it’s a poorly-named effect. It’s not a greenhouse at all, but someone named it that way and it stuck.

January 24, 2009 7:46 pm

To TJ:
We live in similar settings. My home is situated in an open, virtually treeless plot of 16 acres, and it is positioned 700 ft from a rural paved highway that is seldom traveled at night. My home is located 2 miles north of a small town of circa 1,600 people, and beyond that at another 14 miles to the east is a city of circa 110,000. Last week my wife and I were in the city for an evening event, and we departed for home about 10:00pm. It was a clear night as a cold front had passed through the previous day. The temperature reading on my truck instrument was 43F as we traveled on the Interstate in the city. It takes about 22-23 minutes to drive home. During the drive, the temperature reading gradually declined to 34F midway between the city and the small town in definite rural surroundings, then rose to 36 F when we entered the small town. At the gate leading to my home, it was again 34F, but by the time we parked near the house, it was 32F. I went into the house immediately and checked my remote temperature sensor located 200 ft away from my home (and 900 ft away from the rural highway), and it registered 31F. The city has an official weather reporting station for NOAA (it has been surveyed); the temperature recorded at 10:00pm in the city was 45F. (Who says UHI isn’t real?) As the highway leading to my home is poorly traveled at night (maybe 20 cars per hour at 10:00pm), the effect was not as noticeable as in your case. I have compared city and home temperatures for years; the maximum difference between the readings occurs during winter following recent passage of a cold front when the sky overhead clears and the dewpoint drops (and the heat is free to radiate).

Joseph
January 24, 2009 7:47 pm

Re: nobwainer (Geoff Sharp) (18:01:48)
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Geoff, how about our planet’s albedo rising due to increased cloud cover? Can you rule it out?

Roger Knights
January 24, 2009 8:05 pm

TJ (17:56:33) :
BTW, it is -10F outside, we are looking at -20F for the second time in a little over a week and only the third time in my life. Based on my car thermometer, I think there is a different kind of UHI that even affects rural areas. Vehicle caused turbulence heating. … I think that the cold air settles in layers, with the dense, -20F air near the ground, and the warmer, -10F air maybe 10 or 20 ft above. As trucks and cars drive by, the turbulence brings down the warmer air to the highway.

It would be interesting to test whether this effect accounts for much of the increase in temperatures in industrialized countries. Could it be simply tested by, for instance, setting up two temperature stations alongside each other, in an area as near a road as many NOAA stations are, shielding one of the pair from the wind, and seeing if and when their temperatures diverge? If divergence is found, additional test pairs should be set up.

January 24, 2009 8:05 pm

Joseph (19:47:28) :
Re: nobwainer (Geoff Sharp) (18:01:48)
If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Geoff, how about our planet’s albedo rising due to increased cloud cover? Can you rule it out?
It’s very possible, but if so, it is a product of lower solar activity. It’s not all about TSI.

Frank Perdicaro
January 24, 2009 9:19 pm

On the Fortran->C issue: the problem is not compiling the code into another language, the problem is understanding what the code does. Automatically generated C from Fortran would be harder to understand than the original Fortran.
Like others I have been in the commercial software business for decades and have used and taught dozens of languages. For portability I suggest C, but the main problem is figuring out the algorithm. My second choice for portability and speed would be object-oriented Perl, a choice many will see as absurd, but it is FAST, easy to write, and widely available. Heck, one could use FORTRAN, wrap it in a COM layer and express it many ways. The problem is documenting the algorithm.
I agree with the point about using a functional language. It is probably the best approach, but has a serious drawback. So few people know functional programming that you could never convince anybody the program is valid, so nobody would trust the results. People would still want the code in C.
Whatever language is used, visualization should be divorced from data homogenization.
The code will execute quickly, whatever it is. There simply is not that much data to crunch. A nice desk-side system today is a million times faster than a nice computer from a few decades ago. 16GB of RAM now costs as much as a nice dinner with wine.
At SIGGRAPH this past August I spent time with the Google Earth developers asking about an idea. I thought it would be great to have a full-earth layer that showed the inaccuracy in data recording for temperature. No such GE layer exists, but it is possible to create one using a proxy, or man-in-the-middle, approach.

hotrod
January 24, 2009 9:23 pm

The Project VORTEX cars used to gather real-time data on tornado genesis are a good example of the vertical variability of the atmosphere. They had sensors placed to pull ambient air from well above the car to avoid local heating from the pavement and the cars’ engines.
http://en.wikipedia.org/wiki/File:NSSL_vehicles_on_Project_Vortex.jpg
http://www.windows.ucar.edu/tour/link=/earth/Atmosphere/tornado/vortex.html
I have also observed rapid localized micro weather effects while driving. In areas where you frequently see fog form, (very shallow depressions) it is not uncommon to see large temperature gradients over very short distances late at night and early in the morning when cold air has pooled in those depressions. This sort of micro weather was a major problem for folks producing plume distribution models for hazardous material incidents.
I was involved with the emergency response planning for the Rocky Flats facility (Plutonium processing facility) and they had multiple weather observation towers around the facility to track possible trajectory of any plume release. It was not at all uncommon for no two of the observation towers to record the same temperature and wind conditions.
Plume models were written that tried to account for the microweather caused by the terrain effects they had to deal with.
It was very difficult to achieve any reasonable ability to predict where or how fast a plume of toxic material would move off site in case of an incident.

Evaluation of Atmospheric Transport Models for Use in Phase II of the Historical Public Exposures Studies at the Rocky Flats Plant
Authors: Rood A.S.1; Killough G.G.2; Till J.E.3
Source: Risk Analysis, Volume 19, Number 4, August 1999 , pp. 559-576(18)
Publisher: Springer
summary:
Five atmospheric transport models were evaluated for use in Phase II of the Historical Public Exposures Studies at the Rocky Flats Plant. Models included a simple straight-line Gaussian plume model (ISCST2), several integrated puff models (RATCHET, TRIAD, and INPUFF2), and a complex terrain model (TRAC). Evaluations were based on how well model predictions compared with sulfur hexafluoride tracer measurements taken in the vicinity of Rocky Flats in February 1991. Twelve separate tracer experiments were conducted, each lasting 9 hr and measured at 140 samplers in arcs 8 and 16 km from the release point at Rocky Flats. Four modeling objectives were defined based on the endpoints of the overall study: (1) the unpaired maximum hourly average concentration, (2) paired time-averaged concentration, (3) unpaired time-averaged concentration, and (4) arc-integrated concentration. Performance measures were used to evaluate models and focused on the geometric mean and standard deviation of the predicted-to-observed ratio and the correlation coefficient between predicted and observed concentrations. No one model consistently outperformed the others in all modeling objectives and performance measures. About 75% of the maximum hourly concentration predictions were within a factor of 5 of the observations. About 64% of the paired and 80% of the unpaired time-averaged model predictions were within a factor of 5 of the observations. The overall performance of the RATCHET model was somewhat better than the other models. All models appeared to experience difficulty defining plume trajectories, which was attributed to the influence of multilayered flow initiated by terrain complexities and the diurnal flow patterns characteristic of the Colorado Front Range.

http://www.ofcm.gov/atd_dir/pdf/trac.pdf
Reed Hodgin was the primary developer of the TRAC code as I recall.
It might be interesting to hear what his observations are regarding atmospheric modeling?
Larry

anna v
January 24, 2009 9:45 pm

Ed (a simple old carpenter) (13:04:57) :
I would think this is basic stuff and the answers should be settled but there seems to be an argument about this simple concept.
I do not know if somebody has replied to your puzzlement.
The “misnomer” of the global “green house” effect is not disputed even by Real Climate followers. It is a mistaken analogy. The climate “green house” has nothing to do with the “tomatoes greenhouse”. It is a label that has been attached to the effect of various infrared absorbing gases, the most important of these being water vapor, in the atmosphere. The effect of these is real.
Empirically think of the desert, how dry it is and how cold it gets at night just because of this. Generally humid nights keep the heat in everywhere as anybody who has some curiosity about the world he lives in will know. That is the “greenhouse” effect in climate.
How this happens is another confusing story, but happens it does.

Philip_B
January 24, 2009 9:54 pm

If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
We don’t know. Nor do we know what caused the much larger climate shifts from inter-glacial to glacial phases, and back, of the (current) Ice Age.
Volcanoes are put forward as an explanation because they are the only natural forcing we know about that can affect the climate by that much. Although no one can explain how volcanic eruptions can affect climate for the decades to centuries needed for the LIA and MWP, and the millennia to hundreds of thousands of years required for glacials/interglacials.
It’s certain that over century and longer scales there is a climate driver we (or at least mainstream climate science) know nothing about (Svensmark’s GCR theory is a good candidate).
Which means CO2 is only important (to the extent it is important, and that’s debatable) in the periods where the century-scale driver is quiescent, as was true in the 20th century. It remains to be seen if the same holds true for the 21st century.
So, while something has to have caused the LIA, MWP and glacial/interglacial cycles, I see no evidence that it’s the energy output from the level of solar activity. If it is the sun, it has to be via some indirect effect such as albedo changes due to cloud cover.

Ross
January 24, 2009 10:16 pm

Thanks to E.M. Smith, squidly, Ed Scott, cal and others for their very interesting and informative posts.


Ron de Haan (17:34:48) :

The USA now has a President that made the following statement:
“Because the truth is that promoting science isn’t just about providing resources — it’s about protecting free and open inquiry. It’s about ensuring that facts and evidence are never twisted or obscured by politics or ideology.” Barack Obama
And recently he anounced (sic) that he will communicate with all Americans directly via e-mail.
What could go wrong?

An excellent idea! A couple of days ago, shortly after President Obama made that statement, I sent the following to him:

President Obama,
Congratulations on your historic inauguration. I wish you great success in your administration. The great American experiment in civilization, begun over two centuries ago, continues.
It is our obligation to make sure, as best we can, that the experiment succeeds and flourishes.
Mr. President, you recently made public statements regarding the openness of government information and how you would try to reinforce this policy.
In this regard I would like to recommend that – perhaps through the mechanism of Executive Order – you require that all scientists and researchers whose work is supported in any way by public funds, on the publication of their theories, conjectures or findings, make publicly available, through electronic or other suitable media, all relevant data, methods, procedures, etc. that may be required for their work to be reconstructed and then reproduced or verified/falsified by interested parties.
In the interest of true scientific advancement, all published scientific theories and conjectures must be subject to review – especially when financed by public funds. If not subjected to review and verification, such research becomes no better than “magic” and is equivalent to “Trust me, I’m right!”
When potentially very expensive public policy may be made based on the recommendations and opinions of the scientific community, those recommendations/opinions should be subject to open review.
The above suggestion would, of course, not apply to matters related to national security.
Thank you for your attention,
[esignature]

The email address is comments@whitehouse.gov
[sarcasm on]
I’m sure he would be more than pleased to hear from the rest of you, should you feel so inclined.
[sarcasm off]
I don’t really expect anything to come of it, but if enough others also wrote … well, who knows?

DR
January 24, 2009 10:43 pm

@ anna v
Confusing it may be, but the propaganda is wide and deep.
NCAR and a host of other “mainstream” institutions equate the ‘greenhouse effect’ to a real glass greenhouse.
http://www.ucar.edu/learn/1_3_1.htm
There is no shortage of this.
As to its effect, one has to wonder why the desert does in fact get very hot in the daytime with very little water vapor, while in the tropics temperatures do not appear to exceed a maximum threshold despite abundant water vapor. Perhaps Miskolczi’s paper holds more weight than his detractors care to acknowledge, and this is why NASA refused to publish his work.

January 24, 2009 10:55 pm

Ross (22:16:15) :
In the interest of true scientific advancement, all published scientific theories and conjectures must be subject to review
Review by whom?

January 24, 2009 11:03 pm

Douglas DC (08:27:36) :
Amazing !!!!!

January 24, 2009 11:44 pm

Carsten Arnholm, Norway (17:06:51) :
Squidly (16:25:40) :
@ Richard M (14:55:24) :
Fractals are the newest hot math for describing complex systems.
I am sure this just blew over heads LOL
oh well
FORTRAN is not a controlled language, C++ might be the way to go.

Robert Bateman
January 25, 2009 1:26 am

If the last cool periods (LIA) weren’t driven by the Sun, what was the driving factor, and don’t tell me volcanoes.
Probably had to do with the lack of solar activity and the lineup of things that it leads to.
The Big Ice Ages got so cold that the CO2 (antifreeze) went out of the atmosphere into sequestration. They finally ended when sufficient extended solar activity warmed the oceans enough to release the CO2 antifreeze to keep the Ice Sheets from reforming during solar minima.
The Earth wants to play freeze-out whenever solar activity drops off.
The Interglacials are really brief compared to the Ice Ages.
12,000 yrs out of 100,000 is a sharp reminder, and we see what a Modern Maximum can do to glaciers and how few and far between they are even in this interglacial.
What is being thrown out with the bathwater is the effect of solar minima.

Ron de Haan
January 25, 2009 1:55 am

Looking for the switch that caused Glaciation within a period of 1 year?
http://1965gunner.blogspot.com/2008/08/last-ice-age-happened-in-less-than-year.html

January 25, 2009 2:33 am

I am a C#/SQL programmer.
Frank Perdicaro (21:19:39) :

I agree with the point about using a functional language. It is probably the best approach, but has a serious drawback. So few people know functional programming you can never convince anybody that the program is valid, so nobody would trust the results. People would still want the code in C.

C# 3.x incorporates many ideas from functional languages. Maybe this is the “best of both worlds”? (Now I’ve gone and done it – here come the foaming-at-the-mouth “MS is the anti-christ” hordes…)
Squidly (16:18:54) :

Modeling these kinds of complex relationships are not nearly as easy to represent in procedural languages. This is the reason why we have developed object oriented languages. The more complex the system relationships, the more useful object oriented languages become. This is really Computer Science 101.

Squidly, I am not grey-haired yet (just a tinge at temples), but I have been around long enough to participate in the pursuit of the programmer’s “holy grail”, OOPL’s, and even though I think you state a popular opinion among many non-academic programmers today, I would have to disagree with your statement. I took C++ in college, not COBOL, but I could see that Structural Programming already had many of the key elements OOP boosters want to claim as exclusive to OOPLs (for the non CS-geeks: OOPL=Object Oriented Programming Language).
Paul Graham started a little company that used Lisp to gain a competitive advantage:
http://www.paulgraham.com/avg.html. It was bought out by Yahoo and he is now a venture capitalist. He (and others) are not impressed by claims of superiority for OOPLs: http://www.paulgraham.com/noop.html. (BTW: I would highly recommend Paul’s essays on a variety of subjects – I really enjoy his writing style, even if I don’t always agree with his opinions).
I use C# and there are many very nice aspects to it that I like a lot. However, I think the idea that it is just CS101 that complex problems must always be implemented in an OOPL is just not correct. I am also a SQL programmer and I enjoy the declarative nature of SQL far more than writing loops in C#. SQL is far more readable and efficient for many types of complex problems (far more, in fact, than many OO programmers appear to realize, since they often write complex, abstruse code in a C# client or Web Service that should be implemented as SQL queries, IMHO).
And you thought AGW was contentious! Just look what a couple of simple suggestions for the “proper” language to use for a simple project can do when uttered in the presence of a few CS-geeks! (We went through the “language wars” a couple of years ago in our organization: Java vs C# – talk about stress!).

TerryS
January 25, 2009 3:08 am

Re: Philip McDaniel (11:21:56)

Try Visual Basic…if Microsoft hasn’t changed it beyond all recognition. I used it several years ago to build some programs that analyzed Diesel engine performance. The language allows inclusion of ActiveX controls, of which there were many built by independent programmers and companies.

You forget that not everyone is prepared to (or has the money to) pay for a Windows license and a Visual Basic license. The inclusion of ActiveX controls will also result in many people not being able to view the results correctly.
If the project ever gets off the ground then it should use a language that is well defined and available on multiple platforms.

braddles
January 25, 2009 3:32 am

Will people talking about computer languages and other irrelevancies go away please? This is a thread about solar activity.
Best Science blog it may be, but this blog needs better moderation. Maybe a chat room on the side.

January 25, 2009 3:56 am

I think David Archibald is on track. I can give you a trend for 1-5 yrs or I can give it for the next century. The next five yrs should definitely be cooler as we will most likely be in grand minima, if not, expect cooling like SC20.
Was there cooling during SC20? The mid-20th century cooling began in the 1940s. SC20 didn’t start until 1964 – around 20 years later. The end of SC20 (in 1976) actually signalled the start of the modern warming era.
Like a lot of the solar theory stuff – things just don’t add up.

January 25, 2009 4:03 am

It’s quite interesting this magnetic stuff I think! I’ve been tracking the ‘aa’ Index for the past few years. I keep an updated database on the Monthly value, which is from the International Geomagnetic Indices database: http://isgi.cetp.ipsl.fr/lesdonne.htm
The ‘aa’ Index posted a value of ‘9’ in November. December was back to ‘12’ again. The November monthly value ties with June of 1954, which also posted a ‘9’. Before that we have to go back to 1936, in which September also posted a ‘9’. So we are still joint bottom! To find a value below ‘9’ we need to go back further, to November of 1927, which posts an ‘8’.
Now, I’m sure Leif disputes the early database of the ‘aa’ Index. I’m sure he mentioned he thought it was underestimated. Might have remembered this wrong. But if so, then November 2008 might be more unusual than even before 1927.
Now, maybe we should be looking at annual averages here. After all, it is The Sun. If we look at this, then 2008 as an annual figure ties once again with 1965, both posting ‘14’. Actually, 1965 was 13.75 and 2008 14.08, but they round to the same integer, as I don’t believe we use decimals for magnetic indices. But could be wrong on that!
Previous to that, the years 1923 and 1924 both post Annual Averages of ’10’. So a long way back whatever.
Here’s a few plots anyway. Monthly plot since 1950, showing nicely the tick-down in 2005 as already shown on the Ap Index:
http://www.wacv.co.uk/charts/climate/solar_image.php?SelectSeries=aa&SelectTemp=None&SelectRes=Monthly&StartYear=1950&EndYear=2008
Annual since 1900, showing the Low level reached in 2008 since at least the 1920’s (and possibly further back if you don’t trust this earlier data)
http://www.wacv.co.uk/charts/climate/solar_image.php?SelectSeries=aa&SelectTemp=None&SelectRes=Annual&StartYear=1900&EndYear=2008
You can explore the ‘aa’ database at the following link. Select “Solar Series” and then ‘aa’ from the pull down and choose Monthly/Annual and the date range.
http://www.wacv.co.uk/index.php?option=com_wrapper&view=wrapper&Itemid=23

Nick Yates
January 25, 2009 4:54 am

Further to the ‘what programming language is best’ debate. I think it’s pretty pointless trying to re-write the GISS code because you’d never be able to prove that your re-write was functionally the same as the original. I’d be far more interested in someone doing some thorough black box testing on the code as it is, to find out if it actually works correctly. If you could prove that the code always produces results with a warm bias, then you’d really put GISS on the spot.
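A minimal sketch of what such a black-box check might look like (the file names and the two-column “year value” format are made up; the point is only that the original code runs untouched and its output is compared against an independently computed expectation):

// Rough sketch of a black-box check: feed the unmodified code a known input,
// capture its output, and compare against an independently computed expectation.
#include <fstream>
#include <iostream>
#include <map>

std::map<int, double> load(const char* path) {
    std::map<int, double> series;
    std::ifstream in(path);
    int year; double value;
    while (in >> year >> value) series[year] = value;
    return series;
}

int main() {
    auto expected = load("expected_anomalies.txt"); // computed independently
    auto actual   = load("gistemp_output.txt");     // produced by the code under test
    double bias = 0; int n = 0;
    for (const auto& [year, value] : expected) {
        auto it = actual.find(year);
        if (it == actual.end()) continue;
        bias += it->second - value;                 // positive => warm bias
        ++n;
    }
    if (n) std::cout << "mean difference: " << bias / n << " C over " << n << " years\n";
    return 0;
}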

January 25, 2009 5:00 am

Philip_B (21:54:21) :
We don’t know. Nor do we know what caused the much larger climate shifts from inter-glacial to glacial phases, and back, of the (current) Ice Age.
Is that right?… I thought the IPCC was quite comfortable with the Sun as a driver before 1950. Milankovitch cycles every 100,000 years very clearly explain our ice age cycles… it’s a simple thing: Earth’s orbit moves from round to elliptical.
Volcanoes are put forward as an explanation because they are the only natural forcing we know about that can affect the climate by that much.
So we know everything and there can’t be anything else? But once again you forget the Sun and the acceptance of solar forcing before 1950. The ENSO pattern might just be another, and then we have the Earth’s albedo, and don’t forget the emerging area of UV.
Although no one can explain how volcanic eruptions can affect climate for the decades to centuries needed for the LIA and MWP, and the millennia to hundreds of thousands of years required for glacials/interglacials.
I think you need to provide some links showing us how volcanoes have anything to do with past ice ages. Mt Tambora was one of the biggest eruptions in recent times, during the Dalton. It was reported to have a very generous SO2 content. Ice core records only show a period of 2-3 yrs total time in the atmosphere; that is a long way short of your statement.
http://en.wikipedia.org/wiki/File:Greenland_sulfate.png
All other volcanoes since the late 1800’s, including Krakatoa, show minimal to nil change in the global GISS records. Volcanoes have a very short-term effect on climate.
So, while something has to have caused the LIA, MWP and glacial/interglacial cycles
These arguments are often heard from AGW supporters who discredit the Sun and hold volcanoes up as an answer… I’ve heard it several times before in here.

TerryS
January 25, 2009 5:29 am

braddles (03:32:15) :
Will people talking about computer languages and other irrelevancies go away please? This is a thread about solar activity.

if [ "$topic" != "solar" ]
then
  discard "$topic"
else
  accept "$topic"
fi
🙂

January 25, 2009 5:34 am

Ah, the things you think about while fishing. The current state of the sun, whether it leads to a grand minimum or not, should lead to a more stimulating and informed debate on climate. Mathematically, the variation in TSI is not sufficient to cause a little ice age on its own. However, reduced TSI with a cold PDO and potentially a cold AMO could produce significant cooling.
The current La Nina event appears to be driven by the PDO reversal. This can lead to 25 to 30 years of stable or slightly cooling global temperature averages. If the reduced TSI continues during the cool PDO, there is a greater possibility of cooling. (My opinion, not Leif’s.) Once a true trend can be identified, it will force quite a few well educated fellows to rethink their positions.
The one person I feel has the best grasp of the situation is A.A. Tsonis. He has one particularly interesting paper dealing with the synchronization of weather oscillations. His math is well over my head, but produced some interesting and very logical conclusions on natural climate variability. A paper I wish more people on both sides of the debate would read. Then maybe we can get to the real question that needs to be answered. What is the Earth’s climate sensitivity?
When scientists took two estimates of climate sensitivity and divided by two to get an agreeable number, I felt politics instead of science was driving climate science. CO2 does behave as a greenhouse gas. By itself, a doubling of CO2 can increase retained radiation by roughly 1 W/m2, plus or minus a quarter watt (about 0.75 degrees C of warming). That is a lot less than the Hansen-compromised 3 degrees C. The question is the ratio of water vapor feedback/forcing. According to Tsonis’ results, it is much less than the Hansen et al. estimates so brilliantly deduced from outdated but well-cited papers, such as Lean 2000 for example.
Tamino, a well educated and normally rational thinker, finds Lean 2000 “plausible” despite numerous other papers that greatly reduce the TSI impact suggested in her original paper. No offense to Dr. Lean, hers was a good first attempt, but science does occasionally make progress. Climate science will not make progress until well educated and normally rational thinkers regain their natural curiosity and skepticism.
The next decade or so should force them to regain their natural curiosity.
Then what do I know? I am just a fisherman.

Editor
January 25, 2009 5:59 am

braddles (03:32:15) :

Will people talking about computer languages and other irrelevancies go away please? This is a thread about solar activity.
Best Science blog it may be, but this blog needs better moderation. Maybe a chat room on the side.

This is only the second thread “hijacked” by the programmers, and it’s a good thread for it since there doesn’t seem to be much follow up on Archibald’s prediction. I recommend you skip following this thread, there are plenty of others.
As for better moderation, you can bet the moderators at RealClimate would have shut this discussion down by now. (Some of them even work on ModelE.) Be careful of what you wish for.
I am rather bemused by some of the suggestions (object-oriented Perl) (FORTRAN is not a controlled language), the lack of appreciation for the amount of number crunching to be done, the focus on programming language rather than understanding the physics, etc.
I did write a longer post last night, but lost it and don’t want to take the time to rewrite it. I’ll just note that C++ fans might want to check out the Blitz++ library, see http://ubiety.uwaterloo.ca/~tveldhui/papers/iscope97/index.html

January 25, 2009 6:03 am

John Finn (03:56:57) :
Was there cooling during SC20? The mid-20th century cooling began in the 1940s. SC20 didn’t start until 1964 – around 20 years later. The end of SC20 (in 1976) actually signalled the start of the modern warming era.
Like a lot of the solar theory stuff – things just don’t add up.

The Sun is but one driver; we also had the PDO shift to negative around 1940, and there is some talk about the earlier nuclear tests also having some impact. I am not sure if atomic detonations affect climate – maybe someone has some data on this?
But you can’t deny SC20 was a cool period, and once it ended the temperatures moved up immediately, as you noted.

Bill Illis
January 25, 2009 6:53 am

Here is Dr. Judith Lean talking about the solar cycle influence and global warming theory in 8 minutes (she is a very fast talker).
A couple of interesting points – she pulls the ENSO, volcanic, GHG, and aerosol influences out to arrive at the solar-cycle residual. And then she splits the atmosphere into the surface, middle troposphere and stratosphere, showing a much bigger solar cycle influence in the stratosphere (+/-0.3C) than at the surface (+/-0.1C), with the lower troposphere also higher (+/-0.2C). Something we will have to take into account when we talk about the solar cycle in the future.

Editor
January 25, 2009 7:37 am

Per my earlier posting, section labeled: NOAA & GISS:
A bit of errata:
At this point though, my ‘first blush’ is that NOAA has the false precision problem. They hand over ‘monthly mean’ data in 1/100 degree C precision. I don’t see how that is even remotely possible.
That ought to have been 1/100 degree F precision. Sorry… Staring at code too long can make you a bit fuzzy…
Speaking of which, I’ve done a detail pass of step0 and a moderate pass of step1, along with a cursory pass of steps 2, 3, 4, & 5.
The code style varies dramatically from step to step. My impression is that this is either tied to the ‘era’ when it was originally written or, more likely, to the individual who had that bit to write. Some of the style is a bit, er, puzzling (like every time you use a FORTRAN program, which is often, the source is recompiled inline to a binary, used, then the binary deleted…)
The overall flow is that step0 is a pre-process glued on after 1-n had been running for a while. It takes in the raw data from several sources and does a basic ‘stick the datasets together and toss clearly trash and duplicate records’. It also converts the US data to C to better merge with the world data.
The only questionable thing I’ve found so far is a conversion from F to C. It takes F (in monthly means of 0.01 precision, from data originally read with a 0.1 resolution and converted to whole digits of precision for reporting – meaning there are two digits of ‘false precision’, i.e. everything after the decimal point) and converts it to C in tenths of a degree.
The part I’m not sure about is the precedence order of evaluation and the overall impact on final validity of the temp in C.tenths. The code is:
if(temp.gt.-99.00) itemp(m)=nint( 50.*(temp-32.)/9 ) ! F->.1C
Which looks for a ‘data missing flag’ of -99 and as long as valid data are in the field, converts the temp (a real data type in -xxx.x F format) into itemp (an integer data type in XXXX C format, with the last digit being tenths, for each month m [do loop counter is m 1,12] )
I learned FORTRAN under F77 rules long ago and don’t remember a ‘nint’ function (this code is F90), but I do remember that 9 is an int while 9. is a float. So we have -32.(float) and 50.*(float) but /9 int …
This all leaves me a bit unsure what the actual computation would be (with type conversions) and what the actual outcome would be. When I learned it, the type conversions were automatic (and often unplanned…) while ‘nint’ looks like a type conversion function wrapper. Any ideas welcome…
I’m not too worried, though, since the fractional part of the temperature in F is fictional anyway… (Well, I’m worried, just not about this particular bit of computation putting excessive error into the fractional digits, since they are already in doubt…)
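For reference, here is a minimal C++ rendering of that one line, on the assumption that Fortran’s usual mixed-mode promotion applies (the integer 9 gets promoted to real before the divide) and that NINT rounds to the nearest whole number:

// C++ rendering of:  if(temp.gt.-99.00) itemp(m)=nint( 50.*(temp-32.)/9 )  ! F -> 0.1 C
// assuming the integer 9 is promoted to real and NINT = round to nearest.
#include <cmath>

int f_to_tenths_of_c(double tempF) {
    // 50*(F-32)/9 is just 10*(F-32)*5/9, i.e. Celsius expressed in tenths
    return static_cast<int>(std::lround(50.0 * (tempF - 32.0) / 9.0));
}
// e.g. f_to_tenths_of_c(68.0) == 200, i.e. 20.0 C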
After that, steps 1 and 2 do the basic shake & bake of tossing old or odd records and ‘adjusting’ urban UHI to rural neighbors. (Step 2 is the only Python step – maybe a rewrite?) Then steps 3, 4, and 5 do the mapping to ‘zones’ and ‘grid boxes’ and the conversion from temperatures to anomalies, and then the averaging of averages begins in earnest. The anomaly data are once again changed to match what they ought to be, based on what their neighbors are doing. Then the grids, zones, hemispheres, etc. get repetitively averaged together to reach The One True Mean …
While it is too early to speak definitively (I’ve not got a detailed pass of the anomaly, grids, zones, hemispheres etc. code) my overall reaction to it is: By the time you have this many math steps with massaging, averaging, averaging averages, type conversions, precision conversions, neighbor adjustments, neighboring region adjustments, etc. how can you have any idea that the final number is representative of anything in the real world? Especially in the ‘tenths place’.
It takes me about a 2 days to get through a section, and the later ones look harder. Since I can only put a day or two on this per week, it will likely be several weeks to months before I reach an end.
To all the folks who suggested an open source rewrite: I think that would be a fine idea. A large quantity of this code could be dumped just by getting rid of all the recompile steps and all the data file shuffle that constantly happens (input file -> copy -> process -> new file -> copy …) for several files at each step.
Though the first thing, and I think the more productive one, would be to simply pick up the NOAA data that is already UHI-adjusted (avoiding the Hansen-method UHI shake & bake), merge it with clean world data, and find a trend “unadulterated by the reference station method”.
Snippets from the gistemp.txt ‘readme’ type file:
Step 2 : Splitting into zonal sections and homogeneization […]
To speed up processing, Ts.txt is converted to a binary file and split
into 6 files, each covering a latitudinal zone of a width of 30 degrees.
The goal of the homogeneization effort is to avoid any impact (warming
or cooling) of the changing environment that some stations experienced
by changing the long term trend of any non-rural station to match the
long term trend of their rural neighbors, while retaining the short term
monthly and annual variations. If no such neighbors exist, the station is
completely dropped, if the rural records are shorter, part of the
non-rural record is dropped.

Look at the recent jet stream, with air plunging down from Alaska and warm air running up from the Gulf, and tell me that ‘latitudinal’ makes sense. I think there is a basic flaw here in assuming that air masses are uniform along a latitude on a long-term basis (enough so to allow ‘adjustment’…)
Step 3 : Gridding and computation of zonal means (do_comb_step3.sh)
————————————————
A grid of 8000 grid boxes of equal area is used. Time series are changed
to series of anomalies. For each grid box, the stations within that grid
box and also any station within 1200km of the center of that box are
combined using the reference station method.

And after the stations have been history corrected to what’s ‘right’ for their latitude, their anomalies are further corrected based on their grid box neighbors, and whatever a ‘reference station method’ is… (I looked at that bit of code and it will take a bit more time to unravel…)
And then there are two more steps left to go after this…
So again, I’d use the NOAA data to ‘do the right thing’ for an audit check before I’d rewrite this into another language …

Editor
January 25, 2009 7:58 am

Harold Ambler (07:09:52) :
if you could leave a comment (”Hello” would be enough) on my weather and climate blog (http://www.talkingabouttheweather.com).

Done, under Enhanced Gore Effect…

Editor
January 25, 2009 8:26 am

squidly (10:15:43) :
sdk (09:31:49) : my vote would be to use C, and a procedural approach […]
If you are going down the C road, I would recommend C++ and write it in a modular / OOP architecture

Before everyone gets all worked up about a million line project… The actual code involved is fairly small and rather direct. Mostly, so far, it’s been taking table data and doing minor edits or simple math. Frankly, it’s the sort of thing best done with a database and simple report language. (Pick up set a, load fields, pick up set b, load fields from different order, etc. Dump file in yet another order with type conversion).
While this may change in later steps, it’s mostly the fragmented nature of it that is confusing. (4 scripts in a mix of sh and ksh feeding a half dozen FORTRAN programs of about 1/4 to 1/2 page each, with 4 input files and 3 output files plus a dozen gratuitous intermediate files… just to join 3 similar data sets and turn F to C in one of them…) It’s all the format, variable declarations, file read/write, etc. that accomplish nothing that make it slow slogging. If it were rewritten with efficiency in mind, I think you are looking at about one programmer-week, or less, whatever language they used (as long as they were familiar with it…)
Conceptually the process is trivial. Pick up NOAA data. Pick up Antarctic data, Pick up GHCN data. Convert to consistent degrees C or F. Toss broken data (missing or interpolated [flagged with M] and very old pre 1880, remove dupes and overlaps) then produce a merged data set. Plot trends.
Now Hansen has a bunch of boxing, gridding, etc steps with averaging averages of averages that you could modestly easily duplicate, but frankly I’d just plot the trends for each station and average those trends. If that trend is up over ‘a very long time’ we have warming. If not, we don’t. Further, I wouldn’t use the monthly mean, I’d do one trend with the monthly MAX data and another with the monthly MIN data. Now you can see if we have a warming trend, a trend to wider ranges, whatever.
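As a rough sketch of that “average the per-station trends” idea (made-up data structures, an ordinary least-squares slope per station, nothing more):

// Sketch: compute a simple least-squares trend per station, then average the trends.
// Data layout is hypothetical.
#include <vector>
#include <cstddef>

struct Station {
    std::vector<double> year;   // e.g. 1880, 1881, ...
    std::vector<double> temp;   // annual mean (or monthly MAX / MIN) for each year
};

double slope(const std::vector<double>& x, const std::vector<double>& y) {
    const std::size_t n = x.size();
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < n; ++i) {
        sx += x[i];  sy += y[i];
        sxx += x[i] * x[i];  sxy += x[i] * y[i];
    }
    return (n * sxy - sx * sy) / (n * sxx - sx * sx);   // degrees per year
}

double mean_trend(const std::vector<Station>& stations) {
    double sum = 0;
    std::size_t used = 0;
    for (const auto& s : stations) {
        if (s.year.size() >= 2 && s.year.size() == s.temp.size()) {
            sum += slope(s.year, s.temp);
            ++used;
        }
    }
    return used ? sum / used : 0.0;   // average of the per-station trends
}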
My ‘next step’ is going to be a catalog of the number of pieces and the lines of code in each, just to make it clear that this isn’t some incredibly complex thing… Frankly, I was surprised at just how small (& limited) it is. It’s not a GCM, it’s a data massage on some single-digit-MB files…

Editor
January 25, 2009 8:48 am

Syl (11:01:17) :
E.M.Smith
Good work!

Thanks!
Probably not, unfortunately. NOAA does not actually make UHI adjustments to the data, but makes an allowance in the uncertainty instead, from what SteveM has been able to find.
I looked at the site you posted. Interesting but a lot of it is speculative.
At this point all I can say is that the NOAA download site claims they have a UHI adjusted data set. Investigating exactly what that means will have to fall to someone else, as I’m up to my eyeballs in code right now…
What I can say with certainty is that I’m not impressed with the ‘reference station method’ in GISS and in fact I suspect you would be better off doing no UHI adjustment and just looking at the raw result, THEN deciding if you needed to do something to fix it. (And I’d even prefer a ‘fudge number’ assigned to each station I.D. based on the site evals being done, to the semi random shake & bake in GISS …)

hotrod
January 25, 2009 9:03 am

For an open source re-coding of the various models to have true value, it should be in an “open language”.
By that I mean a computer language that is so ubiquitous that it is available to anyone of average means and can compile and run on just about any hardware. It should also be a language that has robust well understood libraries and functions.
To serve the purpose of being an “open source” independent review, you should ensure that a couple college students could run the code on a Linux Beowulf system built from stray lab computers, or on a multimillion dollar mainframe.
Even if you only re-code “black box” modules into an open source code, you have accomplished some useful work. To re-code it, you must understand the black box, which would at least lead to a paper annotating the original code and what it does, and perhaps pointing out some boundary conditions where the code block does silly things, such as producing undefined behavior according to the language specification, or using questionable mathematical and statistical procedures.
In that regard, I would be inclined to stay away from all proprietary languages that require high license costs or run on limited numbers of platforms, but to use languages like Perl, C, C++, java, SQL etc. which have a large base of experienced coders and compile on just about any hardware, as appropriate for the specific task at hand in that module. Build each black box using code that is most appropriate for the primary task of the module and then make sure its input and output have good error checking routines so it provides the proper input/output to the next black box.
Regarding the discussion of ice age triggers, there have also been suggestions that extraterrestrial dust is a possible trigger. They suggest that periodically the Earth passes through a “dusty region of space”, which might have climate impacts.
http://www.astrobiology.com/news/viewpr.html?pid=20481
Although there is debate on this issue as well.
http://www.sciencemag.org/cgi/content/summary/280/5365/828b
The book Earth Under Fire discusses the possibility of cosmic dust falls being associated with climate and interestingly enough having a cycle that matches the solar cycle.
http://books.google.com/books?id=FRr231sanc4C&pg=PA125&lpg=PA125&dq=ice+age+interplantary+dust&source=web&ots=Gxh0rWkzih&sig=ixlwYxpzMpOtCzh0OHV1KRDUp7g&hl=en&sa=X&oi=book_result&resnum=5&ct=result#PPA121,M1
Larry

Editor
January 25, 2009 9:28 am

Carsten Arnholm, Norway (17:06:51) :
Squidly (16:25:40) :
@ Richard M (14:55:24) :
If the existing code is non-trivial and not well understood, the second option will be the fastest.

Before this turns into a 100-person, PM/MS Project-controlled pyramid-building effort 😉 perhaps an example would be helpful…
Step0 has 7 FORTRAN programs and 4 wrapper scripts. The scripts come in several sizes but many are about 1 page. Below I’m going to paste the entirety of one of the FORTRAN programs. It’s one of the larger ones, and without the context it’s not very understandable, but just look at the size. It isn’t like we’re simulating Alaska here… It’s just a fancy cut/paste data filter…
integer itmp(12),itmp0(12)
real dif(12)
C**** replace GHCN (unit 2) by USHCN (unit 1) data; new file: unit 12
C**** assumption is that IDs are ordered numerically low->high in both
C**** input files
open(3,file='ushcn-ghcn_offset_noFIL',form='formatted')
open(1,file='USHCN.v2.mean_noFIL',form='formatted')
open(2,file='v2.meany',form='formatted')
open(12,file='v2.meanz',form='formatted') ! output
open(20,form='formatted',file='GHCN.last_year')
! CountryCode,ID,year,T-data
iyrmax=0
read(3,*) iddiff,dif
10 read(1,'(i3,i9,i4,12i5)',end=100) icc0,id0,iyr0,itmp0 ! read USHCN
if(iddiff.lt.id0) read(3,*) iddiff,dif
20 read(2,'(i3,i9,i4,12i5)',end=200) icc,id,iyr,itmp ! read GHCN
if(iyr.gt.iyrmax) iyrmax=iyr
if(id.lt.id0.or.icc.ne.425) then
C**** just copy non-USHCN station (incl. non-US stations)
write (12,'(i3,i9.9,i4,12i5)') icc,id,iyr,itmp !
go to 20
end if
!!! if(id.gt.id0.or.iyr.gt.iyr0) stop 'should not happen'
if(id.gt.id0.or.iyr.gt.iyr0) then
C**** id-GHCN>id-USHCN or same ID but extra years: merge in USHCN data
30 write (12,'(i3,i9.9,i4,12i5)') icc0,id0,iyr0,itmp0
write(*,'(a6,i9,i5,12i5)') 'ushcn ',id0,iyr0,itmp0
read(1,'(i3,i9,i4,12i5)',end=100) icc0,id0,iyr0,itmp0 ! USHCN
if(iddiff.lt.id0) read(3,*) iddiff,dif
if(id.lt.id0) then ! until USHCN overtakes
write (12,'(i3,i9.9,i4,12i5)') icc,id,iyr,itmp
go to 20
end if
if(id.gt.id0.or.iyr.gt.iyr0) go to 30
end if
! or catches up: id=id0
C*** skip early years not present in USHCN data
if(iyr.lt.iyr0) then
! write (12,'(i3,i9.9,i4,12i5)') icc,id,iyr,itmp
write(*,'(a11,i9.9,i5,12i5)') 'skip v2.mn ',id ,iyr ,itmp
go to 20
end if
C*** replace GHCN by USHCN data if present
if(iddiff.ne.id0) stop 'wrong iddiff'
do m=1,12
if(itmp0(m).gt.-9000) itmp0(m)=itmp0(m)-dif(m)
end do
write (12,'(i3,i9.9,i4,12i5)') icc0,id0,iyr0,itmp0
go to 10
c**** No more USHCN data - copy data for remaining non-USHCN stations
100 if(id.gt.id0.or.icc.ne.425)
* write(12,'(i3,i9.9,i4,12i5)') icc,id,iyr,itmp
110 read(2,'(i3,i9,i4,12i5)',end=200) icc,id,iyr,itmp
if(iyr.gt.iyrmax) iyrmax=iyr
write(12,'(i3,i9.9,i4,12i5)') icc,id,iyr,itmp
go to 110
200 write(20,*) iyrmax
stop
end

January 25, 2009 9:36 am

But you cant deny SC20 was a cool period, and once ended the temperatures moved up immediately as you noted.
….and you can’t deny that SC19 (the strongest cycle ever recorded) was also during a cool period.

Robert Bateman
January 25, 2009 10:00 am

Ron de Haan (01:55:25) :
Looking for the switch that caused Glaciation within a period of 1 year?
http://1965gunner.blogspot.com/2008/08/last-ice-age-happened-in-less-than-year.html

When it’s 40 below outside, your heater is on the opposite side of the room, you open the door to get some firewood, and the inside of your house by the door drops precipitously.
Just pop a hole in the R value of the atmosphere somewhere and plenty of heat will go rushing out. Like, say, the pole.

January 25, 2009 10:05 am

To get back on topic:
http://www.leif.org/research/Ap%201844-2009.pdf
Ap may go down to 4 in the next few months before heading back up. Note that Ap for Odd-Even cycle transitions [like right now] is typically 25% lower than for Even-Odd cycle transitions. This is a consequence of the geometry of the interaction between the solar wind and the Earth’s magnetosphere.

Robert Bateman
January 25, 2009 10:14 am

There are but a few sources of heat on Earth.
One is the gravitational collapse to form the body (leftover).
Another would be tidal forces acting on the planet.
The one that actually maintains the heat input to keep up the current equilibrium (the Sun).
What else is there?
Mess with the input, tinker with the R value of the atmosphere, shade the place with volcanism and you get a new net value that has to find equilibrium.
My point is that the oscillations may drive the place to keep the equilibrium that it seeks, but they cannot change the net heat value of the planet.

hotrod
January 25, 2009 11:14 am

There are but a few sources of heat on Earth.
One is the gravitational collapse to form the body (leftover).
Another would be tidal forces acting on the planet.
The one that actually maintains the heat input to keep up the current equilibrium (the Sun).
What else is there?

Radioactive decay in the core is the major one you have missed.
Heat input to the atmosphere from meteorite burn up (has anyone quantified this with even a back of the envelope estimate?).
Estimates for the mass of material that falls on Earth range from 100 to 1000 metric tons of meteorites per day. Most of this mass would come from dust-sized particles. If all that mass strikes the atmosphere at typical meteorite velocities of 20 km/sec, that is an awful lot of kinetic energy to be dissipated as heat!
Electrical currents induced in the core, mantle and atmosphere by the effects of solar wind and solar storms buffeting the magneto sphere.
Larry

Ross
January 25, 2009 12:23 pm

Leif Svalgaard (22:55:13) :
Ross (22:16:15) :
In the interest of true scientific advancement, all published scientific theories and conjectures must be subject to review
Review by whom?

Thanks for your feedback. In explanation, originally I had considered including phrases such as “… by qualified scientists” or “… by qualified individuals”, but felt that – if disclosure were ever mandated – it would give an out to those not wishing to disclose by claiming that any potential reviewer was “not qualified”.

hotrod
January 25, 2009 12:34 pm

Heat input to the atmosphere from meteorite burn up (has anyone quantified this with even a back of the envelope estimate?).

To answer my own question I took a shot at a ballpark calculation. I hope I did not make any silly error here:
=====================
Meteorite energy at 20 km/sec ~= 200,000,000 joules/kg
100 – 1000 metric tons/day estimated meteorite mass striking the earth
100 metric tons/day = 100,000 kg/day, 1000 metric tons/day = 1,000,000 kg/day
At 100,000 kg/day x 200,000,000 joules/kg = 2×10^13 joules/day, or an average power of about 231,000,000 watts
The Earth’s cross-sectional area = π × radius² ≈ 49.3 million square miles (about 1.28×10^14 m²)
Solar insolation at the top of the atmosphere ~= 1,366 watts per square meter, so the intercepted solar power is about 1.75×10^17 W
231,000,000 W / 1.75×10^17 W ≈ 1.3×10^-9 ≈ 0.00000013% of solar energy input
At 1,000,000 kg/day the average power is about 2,310,000,000 W
2,310,000,000 W / 1.75×10^17 W ≈ 1.3×10^-8 ≈ 0.0000013% of solar energy input.
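For anyone who wants to check the numbers, here is a minimal free-form Fortran sketch of the same back-of-envelope arithmetic (every input value is just the assumption listed above, not a measurement):

program meteor_heat
  implicit none
  real :: v, e_per_kg, mass_lo, mass_hi, p_lo, p_hi
  real :: r_earth, area, tsi, p_sun
  v        = 20.0e3                     ! assumed entry speed, m/s
  e_per_kg = 0.5*v*v                    ! kinetic energy per kg = 2.0e8 J/kg
  mass_lo  = 1.0e5                      ! 100 metric tons/day, in kg/day
  mass_hi  = 1.0e6                      ! 1000 metric tons/day, in kg/day
  p_lo     = mass_lo*e_per_kg/86400.0   ! average power, W
  p_hi     = mass_hi*e_per_kg/86400.0
  r_earth  = 6.371e6                    ! m
  area     = 3.14159265*r_earth**2      ! intercepting disk, m^2
  tsi      = 1366.0                     ! W/m^2
  p_sun    = tsi*area                   ! ~1.75e17 W
  print *, 'meteoric heating, W:        ', p_lo, p_hi
  print *, 'fraction of solar intercept:', p_lo/p_sun, p_hi/p_sun
end program meteor_heat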
Larry

Terry Ward
January 25, 2009 12:55 pm

hotrod (11:14:43) :
Lots of water every year falls on us from “out there”.
Also-
Plasma. Bubbles, wind, stream, what you will;
http://mcf.gsfc.nasa.gov/Fok/PUA1911.pdf
http://fenyi.sci.klte.hu/publ/Praga2002.pdf
http://www.agu.org/pubs/crossref/2003/2002JA009690.shtml
http://www.souledout.org/magfieldsaudio/magfields.html#plasma

January 25, 2009 1:37 pm

E.M.Smith (07:37:18) :
The part I’m not sure about is the precedence order of evaluation and the overall impact on final validity of the temp in C.tenths. The code is:
if(temp.gt.-99.00) itemp(m)=nint( 50.*(temp-32.)/9 ) ! F->.1C
Which looks for a ‘data missing flag’ of -99 and as long as valid data are in the field, converts the temp (a real data type in -xxx.x F format) into itemp (an integer data type in XXXX C format, with the last digit being tenths, for each month m [do loop counter is m 1,12] )
I learned FORTRAN under F77 rules long ago and don’t remember a ‘nint’ function (this code is F90), but I do remember that 9 is an int while 9. is a float. So we have -32.(float) and 50.*(float) but /9 int …
This all leaves me a bit unsure what the actual computation would be (with type conversions) and what the actual outcome would be. When I learned it, the type conversions were automatic (and often unplanned…) while ‘nint’ looks like a type conversion function wrapper. Any ideas welcome…

NINT is a standard Fortran (also F77) intrinsic function meaning ‘nearest integer’. It is used for conversion between floating point numbers and integers. It rounds up or down depending on the value. A similar intrinsic function INT simply truncates the decimals, so for non-negative values NINT(value) = INT(value + 0.5)
Using an integer in the denominator is sloppy practice and may have led to different results for different implementations, but I guess it is safe to assume today that since the numerator is a floating point number, the result will be as if the integer 9 constant was a floating point 9. constant. So the rounding is only in the NINT.
So the result is a Celsius * 10 integer, meaning the value is rounded to the nearest 0.1C
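As a minimal free-form sketch of that conversion (the temperature value is made up for illustration; this is not the GISS code itself):

program f_to_tenths
  implicit none
  real    :: temp
  integer :: itemp
  itemp = -9999                          ! sentinel for missing data
  temp  = 53.7                           ! example monthly mean in deg F
  if (temp .gt. -99.00) then             ! -99 is the missing-data flag
    itemp = nint( 50.*(temp - 32.)/9. )  ! F -> tenths of a degree C; the 9. keeps it all real
  end if
  print *, itemp                         ! 53.7 F = 12.06 C, so this prints 121
end program f_to_tenths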
This tiny detail is illustrating why and how it can be hard to read other people’s code and why an open source project might be a good idea.
Referring to a previous comment someone made, I think the idea would not be to recreate GISS, but to make something that people thought made sense and was well documented, transparent and easily available for inspection by interested parties. Then if the outcome of such algorithms doesn’t agree with GISS etc., you have an opportunity to find out why.

Editor
January 25, 2009 1:46 pm

E.M.Smith (07:37:18) :

if(temp.gt.-99.00) itemp(m)=nint( 50.*(temp-32.)/9 ) ! F->.1C
I learned FORTRAN under F77 rules long ago and don’t remember a ‘nint’ function (this code is F90), but I do remember that 9 is an int while 9. is a float. So we have -32.(float) and 50.*(float) but /9 int …

http://www.nsc.liu.se/~boein/f77to90/a5.html says nint is in F77. It rounds to the nearest integer. (int() rounds toward 0: down for positive numbers, up for negative. Yuck.)
Most languages will take “real op int” and convert the int to a real.

Robert Bateman
January 25, 2009 2:06 pm

Then the meteoric mass falling on Earth is along the lines of AGW CO2…minuscule by comparison.
The radioactive decay should be rather constant (and falling slowly over time).
Electrical currents induced in the core, mantle and atmosphere by the effects of solar wind and solar storms buffeting the magneto sphere.
This one is clearly solar forced, or lack of it.
Just seems to me that Solar is our big gun in this process…so far.
I’m positive that if we keep digging at it and exploring how that process works, we shall be able to define it.
Once that is done, whatever discrepancy remains will tell us how much is still missing, like the way the missing mass has driven more science to find dark energy and dark matter.

hotrod
January 25, 2009 3:31 pm

Electrical currents induced in the core, mantle and atmosphere by the effects of solar wind and solar storms buffeting the magneto sphere.
This one is clearly solar forced, or lack of it.
Just seems to me that Solar is our big gun in this process…so far.

Although it is obviously derived from the sun, it is a “hidden input”, as most people focus on the direct radiant energy from the sun (light and IR) when they are talking about “solar input”. Recently we have been talking about indirect effects like the changes in cloud development due to the solar effects on the magnetosphere, but I have not heard anyone try to tally up a value for how much energy is coupled directly by magnetic/electrical effects to the atmosphere itself when they are talking about solar input.
I have no clue how big the actual energy inputs might be, but induction heating is used to melt scrap iron; if strong coupling exists it is conceivable that large amounts of energy could be transferred in a non-obvious way. It is possible that the electrical current heating induced by solar storms, both in the body of the earth and the atmosphere, might be significant compared to the direct radiant heating from the solar irradiance.
I have never seen any sort of limits placed on this mechanism, or any measurements of it made.
The ground currents developed by lightning strikes are significant and obviously produce local heating. Likewise I am not aware of anyone trying to put a value on direct heating of the atmosphere by lightning. This might, in the grander scheme of things, be a gnat sneeze in a hurricane, or the worldwide power dissipation of thunderstorms might be pretty impressive. If nothing else it is a way of moving energy from high altitudes to low altitudes, as static electrical energy harvested from rising water crystals eventually gets released as lightning inside the cloud and to ground as these space charges try to equalize.
I know that the ground currents induced by solar storms can also be significant! I would not want to pay the power bill for the northern lights if they were being powered off the grid. That power eventually degrades to heat! How much heating is generated in the atmosphere by the kinetic energy of ions and electrical currents in the auroral displays?
Could those sudden stratospheric heating episodes be due to electrical energy being degraded to heat?
People forget how little we know of the electromagnetic activity in the atmosphere. It was just a few years ago that “sprites” and “elves” were discounted as unproven or optical illusions. They were found to be impressively powerful events.
http://elf.gi.alaska.edu/
I submit that it is at least possible that a considerable amount of energy is coupled directly into the atmosphere and surface of the Earth due to magnetic effects. Perhaps they are not included in the energy accounting because no one has thought to quantify them and add them to the Earth’s energy budget. If they are trivial, then, like Edison learning from tests that showed what did not work, we have eliminated one other possible mode of power transfer to the Earth’s weather systems.
Larry

Corrinne Novak
January 25, 2009 3:34 pm

The USA now has a President that made the following statement:
“Because the truth is that promoting science isn’t just about providing resources — it’s about protecting free and open inquiry. It’s about ensuring that facts and evidence are never twisted or obscured by politics or ideology.” Barack Obama
And recently he anounced (sic) that he will communicate with all Americans directly via e-mail.
What could go wrong?
If you want to know whether Obama is really listening to the American Public or if he is listening to Al Gore follow what happens to NAIS (animal ID) The USDA wants to regulate farming with fines up to $500,000 and 10 years in jail so the issue is almost as critical as the carbon dioxide tax. Out of the top ten or so Ag issues, three were “Stop NAIS” and a couple more were support small farms/ farm freedom. “Protect Our Food Supply – Stop NAIS!” Actually made into the top 25 despite a complete media blackout and the lies spread by the USDA.
http://www.change.org/ideas
http://libertyark.net/
This gives me hope that the internet can, with the help of the sun, counter the “Global Warming Hoax” too. Al Gore is also anti-American farming.
This comes from the Ag Journal, Billings, Montana: “At a recent ceremony at the White House, Vice President and presidential candidate Al Gore let slip what many have long believed was his real intention as regards to U.S. agriculture.
“While presenting a national award to a Colorado FFA member, Gore asked the student what his/her life plans were. Upon hearing that the FFA member wanted to continue on in production agriculture, Gore reportedly replied that the young person should develop other plans because our production agriculture is being shifted out of the U.S. to the Third World.”
http://showcase.netins.net/web/sarahb/farm/
I wonder what Gore thinks American citizens are supposed to do for a living once he shuts down manufacturing, farming and outsources computer programming?

Bill Illis
January 25, 2009 4:50 pm

To Robert Bateman (10:14)
One potentially significant source of heat (which is almost never talked about and could be an alternative theory to the greenhouse effect itself) is “gravitational compression.”
What makes a star heat up and compress so that nuclear fusion is possible?
What makes Venus so hot? Why is Jupiter 20,000K at its core? Why is Mars colder even though it has so much CO2 in its atmosphere?
The density of the atmosphere itself produces a gravitational compression and this produces heat (beyond the compression which exists in the mantle and in the core). The weight of the atmosphere itself produced a warming and acts a non-greenhouse blanket keeping the heat in.
The density of the atmosphere itself is a heating and heat-trapping mechanism (disregarding any impact from the Sun.)
This theory has more explanatory power for the various temperatures seen around the solar system and in the stars. Of course, EM radiation from the Sun comes in and leaves the planet, and the rates at which this happens are modulated by the absorption frequencies of the different molecules, so the greenhouse effect also has to play a role.

Ron de Haan
January 25, 2009 6:55 pm

Corrinne Novak (15:34:55) :
[quotes Corrinne Novak’s comment above in full]
Corrinne Novak,
It must be clear that most of us are extremely skeptical in regard to the current developments, the financial and economic crises, the “War on Carbon Fuels”, the tax measures affecting food production and all the other plans.
If these measures result in a failed policy it will not take much time before people start paying with their lives.

Robert Bateman
January 25, 2009 7:13 pm

Gravitational compression would seem to me to be a constant, easily calculable. Its effect then would be to establish a nighttime minimum, as in a polar night, given a constant atmospheric content.
As for Mars, while its atmosphere is almost wholly CO2, it hasn't much of an atmosphere, hence it is far colder for its distance given the solar heating it receives, which it doesn't do a great job of holding onto.
Venus has an extremely thick atmosphere of mostly CO2. It is far warmer than its distance would suggest.
The two planets above would be far different if their atmospheres were identical to Earth's, and so would the Earth be far different if it had a CO2 atmosphere.
For a planet with almost no atmosphere, Mercury is a good test case for TSI and gravitational compression.
So, take the difference between Mercury's nighttime temp and the cold of space, scale up to Earth size, and you have a reasonable approximation of Earth's gravitational compression heating.
Lack of a magnetic field will also help isolate things in the Mercury case.

Robert Bateman
January 25, 2009 7:22 pm

I have to say, Bill Illis, that I like very much your line of thinking. Use the other planets to help solve for Earth. It sure beats the tar out of monkeying with data to further your cut of the political pork barrel pie, which at the end of the day is not Science, but simply lining up at the nearest politically correct feeding trough.

Roger Carr
January 26, 2009 3:20 am

braddles (03:32:15) wrote: “Will people talking about computer languages and other irrelevancies go away please?”
No! Braddles… please! I find all this talk fascinating, and am filled with awe and admiration for those posting. It may even lead to a project which drags (no doubt kicking and screaming) many hot air balloons into Century 21, as implications in the posts being made seem to indicate a whole lot of the “climate science” we are getting is coming from hand-cranked codes and coders of yesterday.
Let the future take fire! Even better; let me watch the first sparks here.

January 26, 2009 5:04 am

John Finn (09:36:25) :
….and you can’t deny that SC19 (the strongest cycle ever recorded) was also during a cool period.
Granted, that is indeed a mystery still unsolved perhaps. I have been searching for something concrete on atomic testing through the late 40’s and 50’s but without much success, but did come across some old forum posts on the late John Daly’s site where Dr. Landsheidt commented on a similar discussion. He was of the opinion that nuclear testing might well explain the discrepancy.

hotrod
January 26, 2009 9:06 am

Granted, that is indeed a mystery still unsolved perhaps. I have been searching for something concrete on atomic testing through the late 40’s and 50’s but without much success, but did come across some old forum posts on the late John Daly’s site where Dr. Landsheidt commented on a similar discussion. He was of the opinion that nuclear testing might well explain the discrepancy.

Just what sort of info are you looking for? The complete list of above ground nuclear tests has been available since 1962 in documents published by the U.S. Government.
===================
1  Trinity  16/7/45  Alamogordo, New Mexico -- tower shot       -- 19 kt yield
2  Combat   5/8/45   Hiroshima, Japan       -- air burst        -- nominal yield
3  Combat   9/8/45   Nagasaki, Japan        -- air burst        -- nominal yield
4  Able     30/6/46  Bikini atoll           -- air burst        -- nominal yield
5  Baker    24/7/46  Bikini atoll           -- underwater burst -- nominal yield
6  X-ray    14/4/48  Eniwetok               -- tower shot       -- 37 kt yield
7  Yoke     30/4/48  Eniwetok               -- tower shot       -- 49 kt yield
8  Zebra    14/5/48  Eniwetok               -- tower shot       -- 18 kt yield
9  Able     27/1/51  Nevada test site       -- air burst        -- 1 kt yield
10 Baker    28/1/51  Nevada test site       -- air burst        -- 8 kt yield
11 Easy     1/2/51   Nevada test site       -- air burst        -- 1 kt yield
12 Baker-2  2/2/51   Nevada test site       -- air burst        -- 8 kt yield
13 Fox      6/2/51   Nevada test site       -- air burst        -- 22 kt yield
14 Dog      7/4/51   Eniwetok               -- tower shot       -- not specified
15 Easy     20/4/51  Eniwetok               -- tower shot       -- 47 kt yield
16 George   8/5/51   Eniwetok               -- tower shot       -- not specified
“Nominal yield” indicates wartime designs intended to yield approximately 20 kt
The list goes on —
Source see:
The Effects of Nuclear Weapons
Samuel Glasstone, Feb 1964
pages 671-677, Appendix B
Was available from the Government Printing Office
The book was republished in 1977 without the detonation listing appendix.
Larry

January 26, 2009 1:18 pm

hotrod (15:31:34) :
How much heating is generated in the atmosphere by the kinetic energy of ions and electrical currents in the auroral displays?
a few tens of GigaWatt. E.g. http://www.swpc.noaa.gov/pmap/

hotrod
January 26, 2009 3:06 pm

How much heating is generated in the atmosphere by the kinetic energy of ions and electrical currents in the auroral displays?
a few tens of GigaWatt. E.g. http://www.swpc.noaa.gov/pmap/

Thanks!
So that works out to a fraction of a percent of the sun’s total irradiance, around about 0.1% give or take.
Larry

Robert Bateman
January 26, 2009 3:07 pm

hotrod:
Check out Tsar Bomba. The Russians exploded a 50 megaton monster in 1961 that scared the bejesus out of the whole world. They sent copies of the filming around the world, wanted everyone to know what they could do. They even managed to sober themselves up. It had to have put out significant heat, and I’m sure you can find other tests they did. They were not shy about the size of the bombs they tested.
REPLY: Lets steer back to the thread please. – Anthony

January 26, 2009 4:53 pm

Granted, that is indeed a mystery still unsolved perhaps. I have been searching for something concrete on atomic testing through the late 40’s and 50’s but without much success, but did come across some old forum posts on the late John Daly’s site where Dr. Landsheidt commented on a similar discussion. He was of the opinion that nuclear testing might well explain the discrepancy.
It doesn’t bother you, then, that 2 recent cycles for which there is good quality data in terms of both temperature and solar observations have failed to produce the expected results.

George E. Smith
January 26, 2009 5:21 pm

Just a question since I haven’t read this thread before.
This Ap index, as in planetary index: is the “planetary” simply a reference to Earth, or is it more generic, involving other planets? i.e. is it an “Earth” index?
George

January 26, 2009 7:33 pm

hotrod (15:06:51) :
“a few tens of GigaWatt.”
So that works out to a fraction of a percent of the sun’s total irradiance, around about 0.1% give or take.

Do the math, Larry. The area of the disk intercepting TSI is 3.1416 * (6,370,000 m)^2 ≈ 1.27e14 m^2. TSI is 1361 W/m^2, so the total intercept is 1.27e14 * 1.361e3 ≈ 1.7e17 W. A few tens of GigaWatt, say 50 for the sum of both hemispheres, is 5e10 W, so the fraction is 5e10/1.7e17 ≈ 3e-7, or 0.00003%.

hotrod
January 26, 2009 8:33 pm

Ooops !
Looks like I fumble fingered the calculator there — thanks Leif!
Larry

January 26, 2009 9:28 pm

hotrod (20:33:45) :
Looks like I fumble fingered the calculator there
Perhaps a bit of wishful thinking there 🙂
The effect is really negligible, as are all the other electrical/magnetic/induction proposals that have been brought up. A few minutes at a summer noon in the Mojave Desert should convince anybody.

January 26, 2009 11:42 pm

John Finn (16:53:46) :
It doesn’t bother you, then, that 2 recent cycles for which there is good quality data in terms of both temperature and solar observations have failed to produce the expected results.
I would not expect temp to follow solar activity precisely, as we have discussed there are other factors involved. What do you think caused a drop in temps after 1940?

January 27, 2009 12:03 am

E.M.Smith (08:26:50) :

Frankly, it’s the sort of thing best done with a database and simple report language.

I do a lot of Oracle programming – I’d like to try this. Maybe we can get a couple of different programming tracks going, one using a database with SQL and maybe a Java version with flat files, so we can compare results.
Can you contact me? You can reach me at bionuclearguy at gmail dawt com.

January 27, 2009 2:16 am

I would not expect temp to follow solar activity precisely, as we have discussed there are other factors involved. What do you think caused a drop in temps after 1940?
Ocean shifts.

January 27, 2009 3:22 am

John Finn (02:16:06) :
Ocean shifts.
And where does the ocean get its heat from?

Editor
January 27, 2009 4:22 am

Carsten Arnholm, Norway (13:37:34) :
NINT is a standard Fortran (also F77) intrinsic function[…]A similar intrinsic function INT simply truncates the decimals. So NINT(value) = INT(value + 0.5)

Thanks! This actually caused me to think a minute (foreign as that sometimes is to my brain 😉 and I realized that while I had done some maintenance on F77, I actually had FORTRAN IV in class! I remembered the INT trick.
Ric Werme (13:46:23) :
It rounds to the nearest integer.

Thanks as well! I’m slowly getting my “FORTRAN legs” back, seasick though it has made me 😉

January 27, 2009 9:23 am

Ric Werme (13:46:23) :
It rounds to the nearest integer
What is the integer nearest to 2.5 ?
to -2.5 ?

January 27, 2009 9:33 am

Leif,
You didn’t put in Ric’s whole quote: “It rounds to the nearest integer. (int() rounds toward 0 (down for positive number, up for negative.”
Depending on the sign, the nearest integer would be either +2 or -2. Wouldn’t it?
[PS: I know you were just being funny with ‘integer’ -2.5.]
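Per the Fortran standard, though, NINT rounds a halfway case away from zero, so NINT(2.5) is 3 and NINT(-2.5) is -3, while the INT(x + 0.5) trick only matches for non-negative values. A tiny free-form test program (illustrative only) shows the difference:

program nint_check
  implicit none
  real    :: x(4) = (/ 2.5, -2.5, 2.3, -2.3 /)
  integer :: i
  do i = 1, 4
    print '(f6.1,2i6)', x(i), nint(x(i)), int(x(i) + 0.5)
  end do
  ! prints:    2.5     3     3
  !           -2.5    -3    -2   <- NINT rounds away from zero; INT(x+0.5) does not
  !            2.3     2     2
  !           -2.3    -2    -1   <- the INT(x+0.5) trick is wrong for negatives
end program nint_check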

Peter Salonius
January 27, 2009 9:43 am

I hope Leif Svalgaard or one of the other people taking part in this discussion can help with a gnawing problem I have had for some time.
Looking at the (admittedly flawed) Vostok ice core data (the trace of temperatures during the last ~400,000 years) it would seem that deglaciation/warming occurs much faster than glaciation/cooling – but I cannot figure out why the warming is so STEEP and the cooling is so gradual.
Peter Salonius

Steve M.
January 27, 2009 11:43 am

Break out your magnifying glass…there seems to be a tiny speck forming in the southern hemisphere!
How desperate am I to be looking so carefully??????

January 27, 2009 11:53 am

Steve M. (11:43:43) :
Break out your magnifying glass…there seems to be a tiny speck forming in the southern hemisphere!
and a SC23 speck at that.

Editor
January 27, 2009 1:05 pm

PD Park (00:03:41) : Can you contact me?
Done.

Editor
January 27, 2009 1:13 pm

Peter Salonius (09:43:47) :
Looking at the (admittedly flawed) Vostok ice core data (trace of temperatures during the last ~400,000 years) it would seem that deglaciation/warming occurs much faster than glaciation/cooling –BUT I can not figure out why the warming is so STEEP and the cooling is so gradual.

My guess, and that’s all it is, would be that the air can warm quickly even if a glacier is nearby, then warm rain can melt the ice fast; but to make the glacier takes many years of snowfall… (Warm air and rain can deliver heat at a rate faster than cool air and snow can deliver physical ice).

Pamela Gray
January 27, 2009 1:56 pm

I see it too. And I also see that it is from 23. I saw the area this morning and wondered if it would produce a spot.

January 28, 2009 12:49 am

And where does the ocean get its heat from?
From the sun – but any additional forcing could have been from decades earlier. Also it might be a re-distribution of heat rather than additional heat.

Ron de Haan
January 28, 2009 1:03 am

O.T and just for the record:
Another cycle 23 sunspot is building according to http://www.spaceweather.com/

January 28, 2009 6:08 am

John Finn (00:49:58) :
And where does the ocean get its heat from?
From the sun – but any additional forcing could have been from decades earlier. Also it might be a re-distribution of heat rather than additional heat.
I suspect all the climate drivers are feeding from the Sun in different ways and use that energy on different time scales. I have raced go-karts for several years and have learned it’s the sum of small things that makes you go fast. Small changes in solar output taken up by many drivers probably determine our climate. There is no other heat source apart from under our feet.
In my opinion, continued research on what regulates the Sun remains one of science’s most important projects.

gary gulrud
January 28, 2009 6:52 am

“Another cycle 23 sunspot is building”
I guess I played 23’s dirge prematurely. Two specks in as many weeks!

Pamela Gray
January 28, 2009 7:12 am

The Earth is not a “small change” entity. My hunch is that given the fact that water circulates around the globe, and it takes a bit to go ’round, air circulates around the globe, ozone circulates around the globe, water vapor circulates around the globe, and all the other stuff in and around our planet circulates and is not well mixed, the Sun is the constant source while the Earth owns the rather variable weather drivers.
How long did that spot last? Did it last as long as other specks (23 or 24) that got numbers?

gary gulrud
January 28, 2009 9:16 am

Chile is experiencing more activity. And Chaiten threatens GW: “Moreover, there has been a predominance of water vapour and volcanic gases of the H2S type”

Hal Romans
January 29, 2009 6:04 am

There are several posts on this site about solar effects on the weather / climate and I feel I’m missing something basic. From what I gather, low solar activity has some correlation to lower temperatures. Is this correct? What exactly does this have to do with the weather / climate effects of CO2 (if any)? Thanks.

January 29, 2009 9:26 am

Look at this: Pavel Hejda, Ivanka Charvátová, Jaroslav Střeštík http://geomag.usgs.gov/iagaxiii/posters/Variability_and_predictability_of_geomag_activity_Hejda.pdf
“…further evidence supporting the idea that the SIM modulates and governs
the geomagnetic activity (the aa-index)”

Jim
January 29, 2009 1:00 pm

Regarding the negative effective-sunspot-number (SSNe) on our web site. The SSNe is derived by fitting a climatological model of the ionosphere to global ionospheric conditions. The fitting is accomplished by adjusting the SSN fed to the model (its only input other than day-of-year and hour-of-day), and the SSN that provides for a zero mean difference between model and observations is the SSNe. When SSNe is negative, that just means the global ionosphere is at lower density than was observed globally when the SSN was zero in the data set used to generate the climatology. Second order effects fold in from geomagnetic disturbances and spatial distribution (of input data) effects.
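If it helps to see the procedure in code form, here is a minimal free-form Fortran sketch of that fitting step as described: bisect on the SSN fed to the model until the mean model-minus-observation difference is zero. The iono_model function and the observation values below are made-up placeholders, not the actual model or data:

program ssne_sketch
  implicit none
  integer, parameter :: n = 5
  real    :: obs(n), lo, hi, mid, resid
  integer :: it, doy
  doy = 28
  obs = (/ 4.1, 3.8, 4.4, 4.0, 3.9 /)   ! stand-in "observed" ionospheric values
  lo  = -50.0                           ! SSNe is allowed to go negative, as noted above
  hi  = 300.0
  do it = 1, 60                         ! bisection on the SSN fed to the model
    mid   = 0.5*(lo + hi)
    resid = sum(iono_model(mid, doy) - obs)/n   ! mean of model minus observations
    if (resid > 0.0) then
      hi = mid                          ! model too dense -> lower the SSN
    else
      lo = mid
    end if
  end do
  print *, 'effective sunspot number SSNe =', 0.5*(lo + hi)
contains
  function iono_model(ssn, doy) result(f)
    real, intent(in)    :: ssn
    integer, intent(in) :: doy
    real :: f(n)
    f = 3.0 + 0.02*ssn + 0.001*doy      ! toy climatology: density rises with SSN
  end function iono_model
end program ssne_sketch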

Editor
January 30, 2009 4:10 am

Hal Romans (06:04:45) :
There are several posts on this site about solar effects on the weather / climate and I feel I’m missing something basic. From what I gather, low solar activity has some correlation to lower temperatures. Is this correct? What exactly does this have to do with the weather / climate effects of CO2 (if any)? Thanks.

The summary would be: The sun is where we get most all of the energy that drives the climate system. There are some folks who think that variation in solar output (beyond just the light) can make us hotter or colder. Other folks don’t see a mechanism for this, so they look elsewhere. Low sunspots have a fairly well demonstrated correlation with cooler times, but not enough to prove causality. The best mechanism I’ve seen proposed is this:
Lower sunspots (so less solar output) gives a trivial reduction in light, but the magnetic field and some parts of the UV can drop much more. This weak mag field lets more cosmic rays hit the earth (CRF or GCR Cosmic Ray Flux or Galactic Cosmic Rays). These make more clouds (like in a ‘cloud chamber’ used in nuclear physics labs). At the same time, the lower UV makes less ozone and the GCR’s help break down what there is.
Ozone blocks the 9 to 10 micrometer IR spectrum, so if there is less of it, more heat can radiate out into space. More clouds in the tropics with less sunshine means less heat in, while less O3 means more heat out. You cool off.
Issues:
The GCR / cloud (Svensmark) theory is relatively new and needs some observation and testing to confirm it. The ozone theory is also fairly new. It’s a nice theory, but we don’t have confirmation yet.
The sun doesn’t shut down all that much. Trivial percentages. For this to work there has to be rather a lot of amplification in the process.
There are times in the record of sunspots (via proxies in some cases) where the sunspots have gone low and the climate didn’t get much cooler. Other times things get cool and sunspots are still high. There is some randomness in the process.
The weather and climate effects of CO2 are largely unrelated to the solar theories. To some extent they are antagonistic theories. (At least the proponents can be antagonistic ;-0
The only connection I can think of is that the IR loss of heat from the planet runs into a wall of absorbing gases in the air. Each gas plugs up some parts of the spectrum. Water vapor and CO2 have a lot of overlap, so add CO2, but lose water: nothing much happens. Add water or CO2 and it’s already mostly blocked. But ozone is almost all alone in the 9-10 micrometer range. To the extent that it is a greenhouse gas, the GHG theories may be right, but have the wrong gas (O3 not CO2) and the wrong ‘driver’ (Sun not Us). CO2 and ozone (and thus solar output) are connected in the GHG blanket theory.
Add to this long duration cycles in fluids and heat on the planet (like ocean hot spots with names like PDO, ENSO, AMO…) and you can have 30 year ocean cycles and 11 year solar cycles interacting. This makes it very hard to sort out!

Alphajuno
January 30, 2009 6:34 am

This is OT but there is an interesting article in the March issue of Astronomy magazine about gravity and what is not known about it. For example, the Astronomical Unit (AU) distance is increasing about 23 feet per century. So if we wait long enough all of our alleged warming problems will cease (tongue-in-cheek).

January 30, 2009 5:17 pm

Adolfo Giurfa (09:26:42) :
Look at this: Pavel Hejda, Ivanka Charvátová, Jaroslav Střeštík http://geomag.usgs.gov/iagaxiii/posters/Variability_and_predictability_of_geomag_activity_Hejda.pdf
“…further evidence supporting the idea that the SIM modulates and governs
the geomagnetic activity (the aa-index)”

That paper was rejected during peer-review for inclusion in the proceedings of that meeting.

Editor
January 31, 2009 5:29 pm

Leif Svalgaard (17:17:01) :
Adolfo Giurfa (09:26:42) :
Look at this: Pavel Hejda, Ivanka Charvátová, Jaroslav Střeštík http://geomag.usgs.gov/iagaxiii/posters/Variability_and_predictability_of_geomag_activity_Hejda.pdf
“…further evidence supporting the idea that the SIM modulates and governs
the geomagnetic activity (the aa-index)”
That paper was rejected during peer-review for inclusion in the proceedings of that meeting.

Any idea why? Or are reasons for rejection kept quiet? I saw some, er, non-standard English that needed a polish, and it looks like yet another ‘we found a correlation, no causality though’; would that be enough?
It will be interesting to see if their predictions of aa are at all close…

Mike Bryant
January 31, 2009 5:47 pm

Pamela,
“The Earth is not a “small change” entity. My hunch is that given the fact that water circulates around the globe, and it takes a bit to go ’round, air circulates around the globe, ozone circulates around the globe, water vapor circulates around the globe, and all the other stuff in and around our planet circulates and is not well mixed, the Sun is the constant source while the Earth owns the rather variable weather drivers.”
It’s mind-boggling isn’t it? Fortunately the GCMs have it all figured out…
Mike Bryant
(sarc/off)
Very well put…

January 31, 2009 6:18 pm

E.M.Smith (17:29:03) :
“That paper was rejected during peer-review for inclusion in the proceedings of that meeting.”
Any idea why?

I can think of several:
1) The idea that solar activity is determined by planetary alignments [or patterns of solar ‘motion’] has been described in the literature as the authors note [but mostly by themselves] and one should directly correlate sunspot numbers with the motions, rather than the indirect measure that the aa-index is.
2) Does the contribution acknowledge and cite previous work of relevance?
No, it mainly lists the authors’ own work, except for the standard
references to Mayaud and the like. There are no references to works critical of the SIM ideas [perhaps mostly because scientists do not bother to comment on this type of paper, regarding it as ‘not even wrong’]
3) Is the content new and original? No, similar ideas have been put forward by Landscheidt and by Fairbridge, and others.
4) The suggestion in the paper that support for this idea would improve if just the data had higher quality begs the question of the viability of the idea in the first place.
5) The paper hangs on the coincidence of the ‘good’ fit in Figure 7, but the two 4th-order polynomials have only three degrees of freedom so it is hard to become excited about the ‘match’. They certainly do not match in the size of the aa-index [likely due to aa being wrong in the first place]

February 1, 2009 4:25 am

Adolfo Giurfa (09:26:42) :
Look at this: Pavel Hejda, Ivanka Charvátová, Jaroslav Střeštík http://geomag.usgs.gov/iagaxiii/posters/Variability_and_predictability_of_geomag_activity_Hejda.pdf
“…further evidence supporting the idea that the SIM modulates and governs
the geomagnetic activity (the aa-index)”

This report is spot on and is rejected by those who refuse to see the facts. If Hejda et al went back further they would see the Sun’s path goes into “chaotic” mode nearly every 172 years depending on angular momentum strength…this is obvious in my latest results.
http://landscheidt.auditblogs.com/archives/95
Here is a chart showing the Sun’s path in current time; as you can see, it’s going into the “chaotic” phase at the exact point where Neptune & Uranus cause the disturbance in Carl’s Angular Momentum graph (right now), just as it is in the Hejda et al paper. This will most likely bring on a Grand Minimum and a certain drop in aa activity. The chaotic path happens every 172 years if the angular momentum is strong enough (most times); my research shows this happening for the past 11,000 years, every time Neptune & Uranus come together.
http://landscheidt.auditblogs.com/files/2009/02/carsten.jpg
http://landscheidt.auditblogs.com/files/2008/12/sunssbam1620to2180gs1.jpg
The evidence is mounting for planetary influence; many papers that may have been rejected in the past will be revisited after this Grand Minimum.

February 2, 2009 2:36 pm

nobwainer (Geoff Sharp) (04:25:58) :
This report is spot on and is rejected by those who refuse to see the facts. If Hejda et al went back further they would see the Sun’s path goes into “chaotic” mode nearly every 172 years depending on angular momentum strength…this is obvious in my latest results.
A current scientific view of Grand Minima:
Solar Phys. DOI 10.1007/s11207-008-9293-6
Grand Minima of Solar Activity and the Mean-Field Dynamo
I.G. Usoskin · D. Sokoloff · D. Moss
Received: 1 September 2008 / Accepted: 9 November 2008
Abstract: We demonstrate that a simple solar dynamo model, in the form of a Parker migratory dynamo with random fluctuations of the dynamo governing parameters and algebraic saturation of dynamo action, can at least qualitatively reproduce all the basic features of solar Grand Minima as they are known from direct and indirect data. In particular, the model successfully reproduces such features as an abrupt transition into a Grand Minimum and the subsequent gradual recovery of solar activity, as well as mixed-parity butterfly diagrams during the epoch of the Grand Minimum. The model predicts that the cycle survives in some form during a Grand Minimum, as well as the relative stability of the cycle inside and outside of a Grand Minimum. The long-term statistics of simulated Grand Minima appears compatible with the phenomenology of the Grand Minima inferred from the cosmogenic isotope data. We demonstrate that such ability to reproduce the Grand Minima phenomenology is not a general feature of the dynamo models but requires some specific assumption,
such as random fluctuations in dynamo governing parameters. In general, we conclude that a relatively simple and straightforward model is able to reproduce the Grand Minima phenomenology remarkably well, in principle providing us with a possibility of studying the physical nature of Grand Minima.

psi
February 2, 2009 5:25 pm

@nobwainer (Geoff Sharp) (04:25:58)
Geoff —
It is fascinating to watch the interplay between your models and real developments in world climate as we speak. Are there any other models that predict an imminent minimum?
Keep up the great work, keep your “cool” under the provocation of those who dismiss the sun’s role as the prime driver, and keep us all informed as the work develops. Assuming your current predictions hold, which to my layman’s untutored mind seems to be unfolding, you and Carl et al are about to have your day in court.
@gary gulrud (06:52:06) :
“Another cycle 23 sunspot is building”
I guess I played 23’s dirge prematurely. Two specks in as many weeks!

Things looking cooler and cooler, aren’t they?
Cheers,
psi (an interloper from the humanities)

February 2, 2009 6:31 pm

psi (17:25:01) :
Keep up the great work, keep your “cool” under the provocation of those who dismiss the sun’s role as the prime driver, and keep us all informed as the work develops. Assuming your current predictions hold, which to my layman’s untutored mind seems to be unfolding, you and Carl et al are about to have your day in court.
Thanks for your support. There are a few predicting an imminent grand minimum, but none using angular momentum controlled by Neptune & Uranus, as far as I am aware. I think in time science will accept the inevitable conclusion that the planets do control our Sun; the evidence is building by the day. Perhaps we need some input from the astronomy side of science, which might be better equipped to understand the principles involved.
Updates are happening daily at the moment with so much information coming to hand.
http://landscheidt.auditblogs.com/archives/95

Editor
February 5, 2009 8:22 pm

Leif Svalgaard (14:36:40) :

nobwainer (Geoff Sharp) (04:25:58) :“chaotic” mode nearly every 172 years depending on angular momentum strength…

[…]
A current scientific view of Grand Minima:
Abstract We demonstrate that a simple solar dynamo model, in the form of a Parker migratory dynamo with random fluctuations of the dynamo governing parameters and algebraic saturation of dynamo action, can at least qualitatively reproduce all the basic features of solar Grand Minima […] We demonstrate that such ability to reproduce the Grand Minima phenomenology is not a general feature of the dynamo models but requires some specific assumption, such as random fluctuations in dynamo governing parameters.

I don’t know what ‘dynamo governing parameters’ are, but is there no room here for delta angular momentum / dt to induce some ‘randomness’ into them?
Is it not at all possible that both are true? The basic mechanism is as described in this article, but a bit of randomness comes from the orbital partners via tides et.al.?

February 5, 2009 9:19 pm

E.M.Smith (20:22:25) :
I don’t know what ‘dynamo governing parameters’ are, but is there no room here for delta angular momentum / dt to induce some ‘randomness’ into them?
Is it not at all possible that both are true? The basic mechanism is as described in this article, but a bit of randomness comes from the orbital partners via tides et.al.?

The parameters governing the dynamo are things like the speed of plasma flows and the amplitude of waves. It is very possible [mandatory, in fact] that the planetary influences have some effect. The one problem is the energetics: the planetary influence is many orders of magnitude smaller than the regular solar flows and forces. An ant being run over by a truck does deflect the truck, but not by much.

Editor
February 6, 2009 7:22 pm

Leif Svalgaard (21:19:27) : The one problem is the energetics: the planetary influence is many orders of magnitude smaller than the regular solar flows and forces. An ant being run over by a truck does deflect the truck, but not by much.
Which would require postulating some significant feedback mechanism, which becomes increasingly unlikely in direct proportion to the degree of feedback needed, which makes the whole thing extremely unlikely… and puts me right back where I was before I asked the silly question in the first place. Drat.
Once the probability of something approaches the probability of ‘then magic happens’, it becomes hard to believe.
So I guess the only question remaining is: How many times will Kohai wish to test the hardness of this floor? …

Editor
February 6, 2009 11:50 pm

I know, following up your own postings is tacky… but …
E.M.Smith (19:22:49) : So I guess the only question remaining is: How many times will Kohai wish to test the hardness of this floor? …
I can’t help seeing myself saying “Oooohhh! LOOK! The Shiny Thing!”
1/2 😎
