Readers may recall the contentious discussions that occurred on this thread a couple of weeks back. Both Willis Eschenbach and Dr. Leif Svalgaard were quite combative over the fact that the model data had not been released. But that aside, there is good news.
David Archibald writes in to tell us that the model has been released and that we can examine it. Links to the details follow.
While this is a very welcome update, from my viewpoint the timing of this could not be worse, given that a number of people including myself are in the middle of the ICCC9 conference in Las Vegas.
I have not looked at this model, but I’m passing it along for readers to examine themselves. Perhaps I and others will be able to get to it in a few days, but for now I’m passing it along without comment.
Archibald writes:
There is plenty to chew on. Being able to forecast turns in climate a decade in advance will have great commercial utility. To reiterate, the model is predicting a large drop in temperature from right about now:
David Evans has made his climate model available for download here.
The home for all things pertaining to the model is: http://sciencespeak.com/climate-nd-solar.html
UPDATE2:
For fairness and to promote a fuller understanding, here are some replies from Joanne Nova
Looks to me like it shows significant warming from 1940 to 1970. That didn’t happen. Seems to invalidate the model.
Murray:
That significant warming was projected between 1940 and 1970 does not necessarily invalidate the model. To say a model was “invalidated” usually means it was falsified, and a model is falsifiable only when the events underlying it can actually be observed. In climatology this is not usually the case, so most climate models are non-falsifiable and thus insusceptible to being invalidated in that sense.
So, it’s going to take us all the way back to the 1940s, when life on planet earth was barely possible.
Is it the same?
http://oi58.tinypic.com/289bnsg.jpg
Here you can see the same collapse in 2006 in TSI as in Ap.
http://oi60.tinypic.com/8w0aid.jpg
What Ap do you anticipate in the coming minimum?
Sorry, this graph is of Ap.
http://www.leif.org/research/Ap-1844-now.png
The last such rapid decline in the magnetic activity of the sun was in the 1990s. The oceans have a slowing effect; they can store energy from the sun.
http://www.leif.org/research/Ap-1844-now.png
lsvalgaard says:
July 8, 2014 at 10:03 am
That’s what I’m doing. Maybe D. Evans has done that; I don’t know, as I haven’t opened his model yet. My hunch about his model (probably not fair, because I haven’t read the whole thing) is that the notch in the temperature filter happens because solar flux diminishes near solar minimum. The notches look like they come from the cycle minima, when flux drops below a threshold (my model says 85-90) and temperatures head downhill until solar activity picks up on the next cycle’s upswing, crosses the threshold going up, and drives temperatures up again.
Ren, thanks for the graph. It shows quite clearly that a major change took place in the Ap index in 2005 and is still continuing today. This should continue going forward, and it will either make or break our case for solar/climate connections. I am quite confident it will make it.
As Ren alluded to, the warm oceans are creating a lag, but this should diminish as we move forward in response to prolonged minimum solar activity. Ocean heat content correlates quite well with the strength of the solar visible and UV bands, not the IR.
I have many other studies which support this, and it is one part of my case for solar/climate connections.
Quite right. Seismic activity is NOT independent of solar activity:
NASA: Volcanic eruptions and solar activity
ABSTRACT
The historical record of large volcanic eruptions from 1500 to 1980, as contained in two recent catalogs, is subjected to detailed time series analysis. Two weak, but probably statistically significant, periodicities of ~11 and ~80 years are detected. Both cycles appear to correlate with well-known cycles of solar activity; the phasing is such that the frequency of volcanic eruptions increases (decreases) slightly around the times of solar minimum (maximum). The weak quasi-biennial solar cycle is not obviously seen in the eruption data, nor are the two slow lunar tidal cycles of 8.85 and 18.6 years. Time series analysis of the volcanogenic acidities in a deep ice core from Greenland, covering the years 553-1972, reveals several very long periods ranging from ~80 to ~350 years, which are similar to the very slow solar cycles previously detected in auroral and carbon 14 records. Solar flares are believed to cause changes in atmospheric circulation patterns that abruptly alter the earth’s spin. The resulting jolt probably triggers small earthquakes which may temporarily relieve some of the stress in volcanic magma chambers, thereby weakening, postponing, or even aborting imminent large eruptions. In addition, decreased atmospheric precipitation around the years of solar maximum may cause a relative deficit of phreatomagmatic eruptions at those times.
Possible correlation between solar and volcanic activity in a long-term scale
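As an aside for readers, the kind of periodicity hunt described in the abstract above can be illustrated with a toy periodogram. This is my own synthetic sketch, not the paper's method or data: a weak 11-year modulation is injected into Poisson "eruption counts" and a simple FFT periodogram is used to find it.

```python
# Illustrative sketch only: detecting a weak ~11-year periodicity in a
# synthetic annual event-count series. Not the actual catalog data.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1500, 1981)
# Synthetic annual eruption counts: Poisson noise plus a weak 11-yr modulation
rate = 2.0 + 0.5 * np.sin(2 * np.pi * years / 11.0)
counts = rng.poisson(rate)

# Simple FFT periodogram of the mean-removed series
x = counts - counts.mean()
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1.0)  # cycles per year

# Strongest peak among periods between 5 and 100 years
mask = (freqs > 1 / 100) & (freqs < 1 / 5)
peak_period = 1 / freqs[mask][np.argmax(power[mask])]
print(f"strongest period in 5-100 yr band: {peak_period:.1f} years")
```

With a signal this weak the peak stands out only modestly above the noise floor, which is consistent with the abstract's description of the periodicities as "weak, but probably statistically significant".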
Lsvalgaard says, “Why didn’t he design it like I would have?”
I think people should separate out two things when discussing this:
1. Irrespective of the mechanism that could mediate the notch filter, is the analysis of the primary data accurate, appropriate and worthwhile? That is, is a notch filter with a delay of around 10-11 years believable?
2. If it is believable, are there any credible mechanisms by which such a notch filter could be delivered?
Sorry, correcting what I wrote: the last such decline in magnetic activity (Ap) was in the 1900s. Silly me, how did I not remember?
http://www.leif.org/research/Ap-1844-now.png
It is likely that the model cannot be tested against the temperature data, because it is all “adjusted” or basically C###. Maybe try the raw CET, the only probably trustworthy surface temperature record (if that hasn’t been tampered with!).
Archibald has been predicting temps dropping “right about now” for at least the last 6-7 years. I’ll believe it when I see it.
Well, David has made good on his stated intentions and published a full and transparent package with all of the applicable code and data.
I had significant misgivings about the drip-feed approach that Jo and David decided to employ for this publication and said so at the time. I received a courteous reply from Jo and was largely persuaded that it was a good-faith effort to promote discussion of the various elements rather than slapping down one Big Kahuna.
Authors do have the right to make decisions about the manner and timing of their publications. I would have preferred to see a different one but I do think their choice was a reasonable one. It did impose a short (2-3 week) delay on the release of fully transparent code and data. But that absolutely does not rise to the level of indecency demonstrated by Mann, Jones et al who have actively sought to prevent the release of their (publicly funded) data.
In any event, publication is now “complete”. I don’t really support the chosen method but I don’t feel it was improper or in any way deceitful. I hope Willis will be able to reach a similar conclusion.
It is nice to see a verifiable/falsifiable prediction. Right or wrong, they’re doing science here.
Steele:
From what I’ve been able to glean from the material at the model’s website, the claims of Evans’s model are not falsifiable. Are you confusing the capacity to be in error about the global temperature with falsifiability? Many people do.
A problem for many is that this model doesn’t attempt to explain causes and effects; it is an analog. As an analog model, I can accept it; its predictive capacity is what would validate it. Even if validated, it would still remain just an analog; it cannot be proved, and the causes and effects would still be unknown.
As Leif says:
As we all know ‘with five parameters I can make the elephant wiggle his trunk’
here is a model (well, just a set of added waves of different amplitudes and different frequencies):
5 waves gives a smooth rather nice fit
http://bit.ly/VFFlDA
28 waves and every wiggle matched
http://bit.ly/1mvUb9w
I’ve been playing with this a bit this morning and I’m impressed with the amount of work that has gone into it and I also found the nuke model hilarious (wrt the SkS Hiroshima Bomb app).
Thank you Dr. Evans for putting this together.
That being said, I can’t say I’m hopeful the model, with its default conditions/settings, is useful for predicting GAST. The solar influence is a tad high and the CO2 influence a tad small; to be fair, Dr. Evans says this himself. The problem is that if one tunes (trains, calibrates, whatever you want to call it) the model to match any of the major datasets, the result will be bunkum, since all the major datasets fail to show anything that could possibly have caused the “coming ice age” scare. We know that scare happened, and I find it hard to believe scientists in the 70s couldn’t differentiate between a global decline in temperature and a global warming hiatus (even if they couldn’t quantify it to tenths or hundredths of a degree like they can today /sarc). However you slice it, the GAST record from any of the major datasets is bunkum. Thanks to Dr. Evans for putting this in an easily tunable platform; we can make some adjustments and see what we get.
I’m already having fun playing with the settings.
http://wattsupwiththat.com/2013/09/25/unwarranted-temperature-adjustments-and-al-gores-unwarranted-call-for-intellectual-tyranny/
” lsvalgaard says:
July 8, 2014 at 9:56 am
”
So you don’t like the way it looks. One thing is certain: as a model, it appears to predict what is actually happening in what we call climate a little more closely. Okay, why not stop complaining and put up YOUR model that attempts to predict the future climate? I assume you DO have one, correct?
Tom O says:
July 8, 2014 at 11:12 am
I assume you DO have one, correct?
Living in California, my prediction is that the weather/climate in the future will be just like today. I’ll be right 97% of the time 🙂
Hmm, new ideas always get mocked; of course some will be right, some not. Let’s wait and see.
2011 (Reuters) – An Israeli scientist who suffered years of ridicule and even lost a research post for claiming to have found an entirely new class of solid material was awarded the Nobel Prize for chemistry on Wednesday for his discovery of quasicrystals.
Skeptics being skeptical of skepticism. No better way to get to the truth.
Yes, a good comment on the ‘utility’ of ‘consensus’.
This model has one advantage over the IPCC models: we will know whether it is true or false soon, not in 2100.
Robert of Ottawa: your problem with the model not explaining the causes is what seems to be tripping a lot of people up.
I don’t think that David EVER suggested that that this model was going to answer every question, or that it would match every physical process in the climate. What he is doing is science as it should be done, and using a model as it should be used.
He started from observations, looking at how changes in TSI and global temperature were related, because he thought that there should be a relationship. He found that, using a black-box model, he could get a first-approximation fit between the two data sets.
He chose to look at the frequency domain. The ~11 year cyclic component of TSI doesn’t appear in the temperature variation data. From a frequency domain POV, this implies either a (very) low pass or a notch filter (or, as some AGW “scientists” seem to want to claim, an open circuit: no relationship at all between the sun and temperatures on earth).
He chose to go with the notch filter, mainly because (as I understand it) that implied a delay, and several independent studies have noted a ~11 year delay.
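For readers unfamiliar with the term, here is a minimal sketch of what a notch filter means in this context. This is my own illustration using a textbook second-order notch transfer function, not Evans's actual model: the gain dips to zero at the notch period (~11 years here) while slower and faster variations pass through nearly unchanged, which is exactly the behaviour needed to erase the solar cycle from the temperature response.

```python
# Sketch: gain of a standard second-order notch filter centred on ~11 years.
# This is a generic textbook notch, NOT the transfer function in Evans's model.
import numpy as np

def notch_gain(period_years, notch_period=11.0, q=2.0):
    """|H(i*omega)| for H(s) = (s^2 + w0^2) / (s^2 + (w0/q)*s + w0^2)."""
    w0 = 2 * np.pi / notch_period           # notch centre frequency
    w = 2 * np.pi / period_years            # frequency being probed
    s = 1j * w
    h = (s**2 + w0**2) / (s**2 + (w0 / q) * s + w0**2)
    return abs(h)

for p in (2.0, 11.0, 50.0):
    print(f"period {p:5.1f} yr: gain = {notch_gain(p):.3f}")
```

The gain is essentially 1 at 2-year and 50-year periods but 0 at exactly 11 years, so a slow trend in TSI would still pass through to temperature while the solar-cycle wiggle would not.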
The model was elaborated from there.
All he is trying to do is build a black-box model. He is specifically NOT trying to do what the IPCC do and model specific processes. He is starting from the other end: looking purely at the two data sets, he is trying to answer the question “how do we get from there to here?”.
Once the model appears to track, only THEN do we start to look at mapping the model components to physical causes, or groups of related physical causes.
The filter may turn out not to be a notch – it could just as easily be a (very) low pass filter, and our friend Willis has already described a very active negative feedback system which would achieve that effect.
But that is stage two. He asked a question, and has proposed a model that might be an answer to that question. The scientific method is not to jump directly into physical processes but to look at the model and see if it can be improved or discredited; the model doesn’t claim, as other models do, to be the ultimate and definitive answer in itself. Once the model seems to be tracking reality, only THEN can we begin the next step in the analysis: to see if there are any physical processes which could account for the various model components. If so, good. If not, then maybe we have to backtrack and look at some of the alternatives.
Science is an iterative process. If you don’t see any real iteration going on, it’s not science.
——-
This is, of course, just my view.
David and Joanna may not agree.
Philip