Guest essay by David Archibald
A couple of years ago the question was asked “When will it start cooling?” Of course solar denialists misconstrued this innocent enquiry. There is no doubt – we all know that lower solar irradiance will result in lower temperatures on this planet. It is a question of when. Solar activity is much lower than it was at a similar stage of the last solar cycle but Earthly temperatures have remained stubbornly flat. Nobody is happy with this situation. All 50 of the IPCC climate models have now been invalidated and my own model is looking iffy.
Friis-Christensen and Lassen theory, as per Solheim et al.'s prediction, has the planet cooling by 0.9°C on average over Solar Cycle 24 relative to Solar Cycle 23. The more years that pass without the temperature falling, the greater the fall required over the remaining years of the cycle for this prediction to be validated.
The question may very well have been answered. David Evans has developed a climate model based on a number of inputs including total solar irradiance (TSI), carbon dioxide, nuclear testing and other factors. His notch-filter model is built around an eleven-year delay between changes in TSI and the Earthly temperature response. The hindcast match is as good as you could expect from a climate model given the vagaries of ENSO, lunar effects and the rest of it, which gives us a lot of confidence in what it is predicting. What it is predicting is that temperature should be falling from just about now, given that TSI fell from 2003. From the latest of a series of posts on Jo Nova’s blog:
The model has temperature falling out of bed to about 2020 and then going sideways in response to the peak in Solar Cycle 24. What happens after that? David Evans will release his 20 MB Excel model in the near future. I have been using a beta version. The only forecast of Solar Cycle 25 activity is Livingston and Penn’s estimate of a peak amplitude of seven in sunspot number. The last time that sort of activity level happened was in the Maunder Minimum. So if we plug in TSI levels from the Maunder Minimum, as per the Lean reconstruction, this is what we get:
This graph shows the CET record in blue with the hindcast of the notch-filter model using modern TSI data in red with a projection to 2040. The projected temperature decline of about 2.0°C is within the historic range of the CET record. Climate variability will see spikes up and down from that level. The spikes down will be killers. The biggest spike you see on that record, in 1740, killed 20% of the population of Ireland, 100 years before the more famous potato famine.
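For readers unfamiliar with the signal-processing term: a notch filter passes most frequencies largely unchanged while strongly attenuating a narrow band around one target frequency. The sketch below computes the frequency response of a generic biquad notch in plain Python; it is purely illustrative of the concept and is not drawn from Evans' unreleased model.

```python
import cmath
import math

def notch_response(freq, notch_freq, r=0.95):
    """Gain of a simple biquad notch filter at a given frequency.

    freq and notch_freq are fractions of the sampling rate (0..0.5);
    r (0 < r < 1) controls how narrow the notch is. Zeros sit on the
    unit circle at the notch frequency; poles sit just inside it.
    """
    w0 = 2 * math.pi * notch_freq
    z = cmath.exp(1j * 2 * math.pi * freq)
    num = (z - cmath.exp(1j * w0)) * (z - cmath.exp(-1j * w0))
    den = (z - r * cmath.exp(1j * w0)) * (z - r * cmath.exp(-1j * w0))
    return abs(num / den)

# The filter passes frequencies away from the notch but suppresses
# the notch frequency itself.
print(notch_response(0.05, notch_freq=0.2))  # far from notch: gain near 1
print(notch_response(0.20, notch_freq=0.2))  # at the notch: gain ~ 0
```

The narrower the notch (r closer to 1), the more surgically a single periodicity — such as an eleven-year cycle — is removed while everything else passes through.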
I consider that David Evans’ notch-filter model is a big advance in climate science. Validation is coming very soon. Then stock up on tinned lard with 9,020 calories per kg. A pallet load could be a life-saver.
David Archibald, a Visiting Fellow at the Institute of World Politics in Washington, D.C., is the author of Twilight of Abundance: Why Life in the 21st Century Will Be Nasty, Brutish, and Short (Regnery, 2014).
UPDATE:
For fairness and to promote a fuller understanding, here are some replies from Joanne Nova
Willis (having finished dinner): no harm in a black box (or a series of black boxes) provided we are simplifying (aka, as you say, no over-fitting).
My point is that the technique is interesting, as is its application to this class of problem. The proof of the pudding will be in the eating, but in the domain of climate modelling I don’t see enough of this kind of discussion.
The problem is the dominance of bottom-up thinking in the modelling community. My response would be the same as that to the traveler seeking directions to Dublin: “I wouldn’t start from here.” It is clear that this complex system will have parts that can be modeled on this basis, but we will make more progress by also using these kinds of “black box” techniques.
So I welcome this discussion, and as you will probably have gleaned I’m surprised you too don’t also focus on this aspect, and hold your powder on whether this particular implementation adds value. IMHO this will prove too pat, but a robust discussion of the place of this and related kinds of approaches is to be encouraged.
Has anyone yet said we don’t actually know?
cynical_scientist says:
June 29, 2014 at 12:07 am
Ummmm … errrr …

Simple???
w.
cynical_scientist says:
June 28, 2014 at 3:53 pm
……..
You made a number of good and sensible points despite all the back yard’s flying feline fur. The heliospheric magnetic field is rather weak at the Earth’s orbit; however, the Earth’s magnetic field shows similar long-term trends but in the opposite direction, and the correlation is particularly strong in Antarctica. Regardless of which TSI model is used (Svalgaard’s or the earlier Lean one), the negative correlation gives an R² as high as 0.77.
Denying the existence of something which is clearly in the data (as per Dr. S’s practice in this case), whether for lack of understanding or any other reason, is counterproductive. However, the NASA-JPL scientist Dr. Dickey is of a more open mind: http://www.nasa.gov/topics/earth/features/earth20110309.html
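For reference, an R² figure like the 0.77 quoted above is presumably the squared Pearson correlation coefficient, which is large and positive even when the underlying correlation is negative. A minimal sketch with made-up numbers (not vukcevic's data):

```python
def r_squared(xs, ys):
    """Squared Pearson correlation coefficient — the usual meaning
    of R² for a simple linear fit between two series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

# Made-up series trending in opposite directions: the correlation is
# negative, but R² is still close to 1.
tsi = [1.0, 2.0, 3.0, 4.0, 5.0]
field = [5.1, 4.2, 2.9, 2.1, 0.8]
print(r_squared(tsi, field))
```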
PMOD TSI at WoodForTrees stopped at 2011.75. Leif had mentioned there was more recent data but unfortunately I didn’t copy the link. WFT gave the FTP directory and said the file was “pmod_composite_latest.dat” but it’s not currently there.
ftp://ftp.pmodwrc.ch/pub/data/irradiance/composite/
Inside directory “DataPlots”, by the latest file date (5/20/2014), it looks like the recent stuff would be in “org2pmod_composite.dat” but that’s the composite and the different TSI series that went into the composite, and while informative it only goes to 9/17/2011.
Which makes the best remaining candidate “composite_42_64_1402.dat” which has “extended data” back to 1976 which WFT does not, and runs forward to 2/25/2014.
http://www.climate4you.com/Sun.htm is informative, with info on Ap Index and other things (and has the unfortunate Lean 2000 reconstruction), however it points to NOAA’s National Geophysical Data Center (NGDC) and an old 2003 PMOD file.
Hunting around, this looks like it should have the current info, in a directory similarly named to the climate4you-mentioned location: http://www.ngdc.noaa.gov/stp/space-weather/solar-data/solar-indices/total-solar-irradiance/
But aside from “Smithsonian Astrophysical Observatory 1902-1954”, nothing was updated since 2013, and nothing looks like the composite, just different series.
So is “composite_42_64_1402.dat” the most recent TSI data I want?
lsvalgaard
Cosmic rays also produce quick electrons in the heliosphere.
A Very Local Interstellar Spectrum for Galactic Electrons
M.S. Potgieter, E.E. Vos, R.R. Nndanganeni, M. Boezio, R. Munini
(Submitted on 7 Aug 2013)
We present a new local interstellar spectrum for cosmic ray electrons over an energy range from 1 MeV to 70 GeV. Below (0.8 ± 0.2) GeV it has a power law form, E^−(1.55 ± 0.05), with E kinetic energy, which is consistent with previous studies. This is derived from comparing Voyager 1 electron data observed during 2010 with a comprehensive modulation model. However, to reproduce the PAMELA electron spectrum observed at Earth during late 2006 and to address an unexpected increase in the electron spectrum between about 2 GeV and up to about 20 GeV, a spectral index of −(3.3 ± 0.1) instead of the reported −(3.18 ± 0.05) was found for this energy range. This feature cannot be caused by solar modulation or any other process inside the heliosphere.
http://arxiv.org/pdf/1308.1666v1.pdf
HAS says:
June 29, 2014 at 1:11 am
Thanks, HAS. I’m sorry, but my value threshold for a black box is much, much higher than that it result in “no harm” …
Regarding overfitting, since David has refused to reveal the secret knowledge, we have no idea how many tunable parameters there are in the model … but given that it is composed of a splitter, three filters, three multipliers, and an adder (weighted?), it’s going to be a large number. One diagram they show of a portion of the system shows no less than 11 tunable parameters.
Remember what von Neumann said about parameters and elephants: that with four tunable parameters he could model an elephant, and with five tunable parameters he’d make him wiggle his trunk … and Evans’ model has at least eleven tunable parameters.
As a result, while you may be impressed that David Evans’ model can make the elephant wiggle his trunk, me, not so much …
Best regards,
w.
If the current abortive El Niño is beginning to overturn toward a possibly much stronger La Niña then Evans’ model would appear to receive some initial support.
Yeah well that is a picture of his model not of the notch filter. The notch filter is the bit labelled “notch filter”. I offer no support for the model.
NikFromNYC says:
June 28, 2014 at 9:22 pm
David Archibald asserts: “We are well into the 21st century and, as far as I know, there are only two models with predictive ability that are still in the game – mine and David Evans’”
Yet if you just turn down the water vapor positive feedback then the climate models should do fine. It’s not that they are falsified in spirit, only in overenthusiastic and alarming sensitivity. Their need to parametrize a bit doesn’t suddenly throw good light on mere wiggle matching models.
========
You’re correct to a point but I think it goes a bit deeper than that.
http://climategrog.wordpress.com/?attachment_id=884
Volcanic forcing has been ‘tuned’ downwards in order to work with the spuriously high sensitivity of the models. Not only are they tuning the model parameters, they are tuning the input data too.
There is also a long term warming effect once the volcanic “dust” settles. This is not recognised AFAIK and is being falsely attributed to GHG. see figures 3 and 10 in that post.
and this:
http://climategrog.wordpress.com/?attachment_id=955
ACE News #122 (4/30/2009) reported that the intensity of galactic cosmic ray (GCR) Fe at ~350 MeV/nuc was ~12% greater in early 2009 than in 1997-1998, and also greater than ever before in the space age. The measurements were made with the Cosmic Ray Isotope Spectrometer (CRIS) on ACE. As 2009 progressed the GCR intensity continued to increase until approximately New Year’s Eve, when the intensities of major species from C to Fe were each 20% to 26% greater than in 1997-1998 (e.g., oxygen in Panel (a) at the right). Early in 2010 the intensity decreased to 1997-1998 levels.
http://www.srl.caltech.edu/ACE/ACENews/ACENews134.html
Ah Willis, if only we were simply making elephants wiggle their trunks (or their speedos for our Aussie friends).
The thought of 20 MB of error-prone Excel fills one with horror. Could he not have used a real programming language where you can see what you are doing? One A4 page is about the limit of what it’s safe to do in a spreadsheet – and no macros! Every single business model I have ever seen in Excel contained gross errors which use of a proper language would have avoided.
Stephen Wilde says
http://wattsupwiththat.com/2014/06/28/a-cool-question-answered/#comment-1671625
Henry@ren @Stephen Fisher
It seems you do not understand that ozone (& others) also back radiate
http://blogs.24.com/henryp/files/2011/08/Atmospheric_Transmission.png
Trenberth reports that 25-30% of direct radiation from the sun does not reach earth because it is absorbed and re-radiated by the atmosphere. Ozone on its own is responsible for about 25% of that 25%-30%. Hence, there is more back radiation (cooling) if there is more of it. However, Trenberth forgot or never knew about the peroxides and nitrogenous oxides also being formed TOA.
Although the new analysis suggests, perhaps surprisingly, that supernovae are on the whole good for life, high supernova rates can bring the cold and changeable climate of prolonged glacial episodes. And they can have nasty shocks in store. Geoscientists have long been puzzled by many relatively brief falls in sea-level by 25 metres or more that show up in seismic soundings as eroded beaches. Prof. Svensmark finds that they are what can be expected when chilling due to very close supernovae causes short-lived glacial episodes. With frozen water temporarily bottled up on land, the sea-level drops. – See more at: http://www.astrobio.net/topic/deep-space/cosmic-evolution/did-supernovas-boost-life-on-earth/
HenryP says:
June 29, 2014 at 3:05 am
Stephen Wilde says
wattsupwiththat.com/2014/06/28/a-cool-question-answered/#comment-1671625
Henry@ren @Stephen Fisher
It seems you do not understand that ozone (& others) also back radiate
http://blogs.24.com/henryp/files/2011/08/Atmospheric_Transmission.png
Trenberth reports that 25-30% of direct radiation from the sun does not reach earth because it is absorbed and re-radiated by the atmosphere. Ozone on its own is responsible for about 25% of that 25%-30%. Hence, there is more back radiation (cooling) if there is more of it. However, Trenberth forgot or never knew about the peroxides and nitrogenous oxides also being formed TOA.
====
Which also relates to what I said above and:
http://climategrog.wordpress.com/?attachment_id=884
http://climategrog.wordpress.com/?attachment_id=955
This layman is tired of models, though I understand the need to strive to perfect them in order to try to develop long term forecasts and understand our climate. But for now, when it comes to determining global cooling or warming, it seems to me the best course for the average guy is to watch the poles.
The second year that the Arctic and Antarctic sea ice extents remain significantly above the mean during the summer months at each pole, I will start being concerned about cooling. No corruption or “adjustment” of temperature data can change those signs. It seems to me that the satellite measurements of these extents, while not perfect by any means (as has been proven in the past), are still far more reliable and less susceptible to tampering or error than temperature data at any level from any source.
Willis and Leif,
I think that now we have all gotten a pretty good grasp of your difficulties/objections to what David Evans and Jo Nova have done so far.
If David and Jo are as good as their word it will not be long before the full Excel-based model is released (including full code and data). At that point both of you will be free to dissect it and/or run it with whatever data-set you choose or prefer. I struggle to see how you will be able to resist that, Willis.
Until the release I don’t see how the endless argument conducted above between yourselves and others is going to achieve anything useful.
To Christopher Monckton: I think you are over-reacting. Until the model is released there is not much in what David has said that needs such a vigorous defense. You need to step outside and smell the roses as well.
By the way can I just say that most of those who have been following the series of blog posts by David and Jo seem to be comfortable with the way they are going about it, but still remain properly skeptical of the collection of hypotheses that have been advanced. That is the space that I put myself into at this stage. I look forward to the release of the model out into the wild. Once released there will be plenty of predators ready to rip into it. I hope David has braced himself for the event.
Neutral gas cloud is not at all indifferent.
http://www.srl.caltech.edu/ACE/ACENews/ACENews136.html
I am concerned about the lack of unity among skeptics, as displayed in comments here on this blog.
I have said it before: there is a simple way to determine a causal correlation of weather / warmth between the sun and earth, namely by observing the speed of the drop in temperatures over time.
here you can see my initial results, up until 2012
http://blogs.24.com/henryp/2013/02/21/henrys-pool-tables-on-global-warmingcooling/
For maximum temperatures I [we] have a warming rate of
0.036K/annum from 1974
0.028K/annum from 1980
0.015K/annum from 1990
-0.013K/annum from 2000
I have recently updated my tables to 2014
For maximum temperatures I [we] have a warming rate of
0.034K/annum from 1974
0.025K/annum from 1980
0.014K/annum from 1990
-0.008K/annum from 2000
You can criticize me by saying that the sample size is small (27 weather stations from each hemisphere), but the downward trend is simply undeniable.
We also see a slight slowing down of the deceleration over the past 2 years.
[don’t be deceived that cooling is already over- we are looking at maximum temperatures – the effect is still coming]
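Per-annum rates like those in the tables above are presumably ordinary least-squares trends fitted to yearly means. A minimal sketch of how such a K/annum slope can be computed (illustrative data only, not HenryP's stations):

```python
def warming_rate(years, temps):
    """Ordinary least-squares slope in K per annum — one common way
    such trend figures are computed from yearly mean temperatures."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical yearly mean maxima rising at exactly 0.02 K/yr:
years = list(range(2000, 2010))
temps = [15.0 + 0.02 * (y - 2000) for y in years]
print(round(warming_rate(years, temps), 3))  # 0.02
```

Fitting the same formula over windows starting in 1974, 1980, 1990 and 2000 yields the sequence of decade-dependent rates quoted above.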
When trying to explain this [to myself]: clearly, we can see some exponential downward trend?
Now look at this graph here:
http://ice-period.com/wp-content/uploads/2013/03/sun2013.png
It shows the field strengths of the sun.
Do you see that you can draw a hyperbolic curve from the top to the bottom and a parabolic curve from the bottom to the top that seems to reach a minimum/maximum around 2016?
Can you see that there is direct correlation with the drop in maximum temperature and the drop in field strengths on the sun?
Hence, I have been saying that decreased solar field strength causes more of the more energetic particles to be released from the sun, which convert to ozone, peroxides and nitrogenous oxides. If the atmosphere did not do this, we’d be dead. However, more ozone, peroxides and nitrogenous oxides TOA deflect more of the incoming SW radiation, especially of the UV type.
Hence the current cooling of the atmosphere and the oceans.
@Monckton, Willis, Leif, etc.
Why do any of you think any TSI is even remotely correct? The actual TSI ranges from neutron flux to atoms, in EM from gammas to radio waves. The last time I tried to estimate it (in the 80s), I couldn’t because the observations just weren’t available. I doubt very much whether our current TSI includes all known output. Why do any of you think one TSI is to die for vs. another TSI? I don’t think the data is there.
HenryP: You can see how ozone absorbs ionizing radiation. It is much weaker than in winter.
http://terra2.spacenvironment.net/~raps_ops/current_files/rtimg/dose.15km.png
As the end of the month of June is here, for those who wish to use the CET records to prove one thing or another, just a brief reminder:
June is the month that clearly and indisputably shows there was absolutely no warming (regardless of CO2 content or emissions, notch filters, multidecadal natural or any other kind of variability) for the whole 350 years of the longest and most scrutinized temperature record there is: http://www.vukcevic.talktalk.net/CET-Jun.htm
Over to you ….
bobl says:
June 28, 2014 at 11:49 pm
While I am making this point, Leif, are you claiming the reconstruction David used is wrong for every solar cycle it is trained on?
There are three pieces to the puzzle:
1) before 1978, no data, only reconstruction
2) 1978-2002, observations but with systematic errors
3) 2003-2014, observations [SORCE/TIM] that are reliable
One must splice the three pieces together to get a composite covering 1610-2014.
This must be done correctly, but even then one is hostage to the accuracy of piece #1. The Lean [2000] reconstruction is not correct [grossly in error] and the splicing of #3 to #2 is wrong.
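Splicing pieces of differing calibration is commonly done by offset-adjusting the less reliable series so that it agrees with the better-calibrated one over their overlap. A hedged sketch of such mean-matching (an illustration of the general technique only, not Leif's actual procedure):

```python
def splice(reference, other, overlap):
    """Shift `other` so its mean over the overlap years matches
    `reference`, then merge the two (reference wins where both exist).
    Series are dicts of {year: value}. A simple mean-match — one of
    several ways TSI composites are stitched together."""
    offset = (sum(reference[k] for k in overlap) -
              sum(other[k] for k in overlap)) / len(overlap)
    merged = {k: v + offset for k, v in other.items()}
    merged.update(reference)
    return merged

# Toy series with a 2-year overlap and a 0.5 W/m^2 calibration offset:
new = {2003: 1361.0, 2004: 1361.2}           # reliable recent piece
old = {2001: 1366.1, 2002: 1366.3,
       2003: 1361.5, 2004: 1361.7}           # systematically 0.5 high
composite = splice(new, old, overlap=[2003, 2004])
print(round(composite[2001], 1))  # 1365.6
```

The point Leif makes still holds after any such adjustment: the composite is only as trustworthy as its least reliable piece, and a bad offset at either seam propagates through the whole 1610-2014 record.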
Mr Svalgaard continues, embarrassingly, to fail to apologize to Dr Evans for having accused him of having acted near-criminally in deliberately using false TSI data. Instead, he tries to divert attention from his persisting falsehood by a variety of diversionary wriggles. One of the latest of these is his assertion that “The plot on the SORCE/TIM website is not a ‘historical record’, but a flawed reconstruction by Lean dating back to 2000”.
Actually, the reconstruction on the SORCE/TIM website to which I had provided a link was, as I had previously stated, by Krivova et al., though I suppose it is possible she might have relied on Judith Lean’s earlier work, as the IPCC itself has done in the past. I had also previously stated that the record went back some 400 years, not just to 2000 as Mr Svalgaard has said. The TSI data on this official website are strikingly similar to the TSI data in Dr Evans’ graph. On comparing the two graphs, there is certainly no respectable basis for Mr Svalgaard’s still-unretracted allegation, for Dr Evans used data remarkably close to that which is openly posted on the very website from which Mr Svalgaard himself took (and then doctored) the graph on the basis of which he challenged Dr Evans’ assertion that solar activity had recently been declining.
It is really time for Mr Svalgaard to apologize to Dr Evans for his hasty and unjustifiable allegation that Dr Evans had knowingly used incorrect TSI data, and that in doing so he had acted in a fashion that was “almost fraudulent”. There is no respectable basis for any such allegation, and it must now be withdrawn.
The question whether Dr Evans is right to draw the conclusion from the data that TSI is falling in a manner that will have an effect on global temperature is one on which he may or may not be correct: as I have said earlier in this thread, the matter can be argued either way. But it should be blindingly obvious even to Mr Svalgaard, who is not known for his common sense, that there would scarcely be any advantage to Dr Evans in deliberately tampering with the data so as falsely to show a sharp decline in TSI, then to make a startling prediction that global temperature, far from remaining static as it has for the past couple of decades or rising as the usual suspects predict, will instead fall within not more than ten years, and then to say that if the temperature does not fall he will have been proven wrong.
It ought surely to be blindingly obvious even to the meanest and most knuckle-dragging intelligence that such behavior on Dr Evans’ part could not confer any conceivable advantage upon him. The fact is that Dr Evans did his best to put together a reasonable TSI dataset (close in all material respects to the Krivova historical reconstruction on the SORCE/TIM website); that one can apply his model to that or any other TSI dataset; and that if Mr Svalgaard does not like Dr Evans’ TSI dataset he will be free, within weeks, to apply any other TSI dataset to Dr Evans’ model.
By the same token, Mr Eschenbach should also apologize to Dr Evans. He too has used the word “fraudulent” of Dr Evans, this time because he would have liked Dr Evans to release his code and data before rather than after giving an outline of what his code and data are for. For heaven’s sake, stop whining. You have been plainly told all the code and data will be made fully and publicly available. Surely you can tell the difference between that honest approach and the approach of Mr Mann, who refuses to this day to release data for a paper that first appeared in 1998? If so, why did you accuse Dr Evans of being no better than Mr Mann in this regard?
Now, this bandying-about of the word “fraud”, when it is manifest that no fraud has been committed, is not the sort of language that marks out the man of science. To make an allegation of “fraud” is to suggest that criminality is present – in particular, a form of criminality that, where it is alleged, is calculated to damage the reputation of a man of science, in that the allegation is that the scientist has acted in a wilfully deceptive manner for his own profit or with the aim of causing loss to another. On any view, Dr Evans’ work does not fall within this definition.
Suppose that a group of researchers were to conduct a survey of many thousands of scientific papers to see how many of them stated their endorsement for IPCC’s notion that recent warming was mostly manmade. Suppose they marked their own datafile as saying that only 0.5% of the sample explicitly endorsed that “consensus” notion. Suppose they then disregarded their own result, carefully failed to report it and instead reported that they had found 97.1% support for the consensus thus defined. Suppose they then wrote another scientific paper, explicitly stating that they had found near-unanimous support for the consensus thus defined. Suppose that governments, acting on this supposed evidence of consensus as to the magnitude of Man’s impact on the climate, were to cite it as justification for predatory measures inflicting massive loss or hardship on people struggling to pay their fuel and power bills. Now, in UK law, there are two relevant offenses of fraud: fraud by misrepresentation, and fraud by abuse of the public trust that is expected of academic researchers for whose services taxpayers handsomely pay. I shall leave it to readers to decide whether, if the evidence I have outlined above is true, the group of researchers in question might, on the face of things, have committed fraud.
Now, suppose that a researcher abandons his former highly-paid job profiting from the global warming scam, shuts himself away unpaid for some years and comes up with a theory that, if it is wrong, will be shown to have been wrong within not more than a decade. How will he benefit from that, unless he is shown to have been right? To whom will he occasion any loss, other than to himself, if he is eventually shown to have been wrong? And what evidence is there of any deception? On examination, it turns out that there is none. Mr Svalgaard was simply incorrect in his allegation that Dr Evans had used incorrect TSI data. He might have used data that Mr Svalgaard disapproves of. It may even be that Mr Svalgaard is right to prefer one TSI dataset above another. But Mr Svalgaard has been too hasty in accusing the blameless Dr Evans of acting in an “almost fraudulent” fashion, and has been too dilatory in his persistent failure to apologize: a failure that will prove deeply embarrassing to Mr Svalgaard’s own reputation. For Mr Svalgaard has fallen well below the standard of probity expected of the true scientist.
As for Mr Eschenbach, he too should apologize. His use of the word “fraudulent” seems to have been more rhetorical and en passant than the calculated, deliberate and malicious use of the term by Mr Svalgaard, who accompanied it with an impertinent suggestion that Dr Evans had deliberately used wrong TSI data because he had an “agenda”, and has sullenly refused to apologize, repeating the libel on several occasions and demonstrating with each new libel and each new refusal to apologize and with each new diversionary tactic that he is not a scientist but a mere quack. Mr Eschenbach should appreciate that the use of such intemperate language is unscientific and ought to be avoided except where there is plain evidence of criminality, whereas in the present instance there is plain evidence of no criminality at all.
As for those who think I ought not to have made an issue of this, let them understand that the real battle in which we are all engaged is a battle to restore the use of reason to scientific discourse. At present, the world’s governing class has discovered that, thanks to the near-universal scientific and mathematical ignorance to which generations of State-controlled education has reduced the populace, it can manufacture scientific scare stories as justification for a vast centralization of power in the hands of new supra-national bodies elected by nobody.
In the end, the only defense against the extension of predatory, anti-democratic government to the global scale under the pretext of Saving The Planet is the truth itself. That is why it is essential that we should be careful to discriminate between actions that are genuinely fraudulent and actions that are not. And, though it is not easy to keep one’s temper while so much anti-science is peddled by the profiteers of doom, we should surely keep doing our best to try.