Scafetta on his latest paper: Harmonic climate model versus the IPCC general circulation climate models

Guest Post by Dr. Nicola Scafetta


Herein, I would like to briefly present my latest publication, which continues my research on the meaning of natural climatic cycles and their implications for climate change:

Nicola Scafetta, “Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models” Journal of Atmospheric and Solar-Terrestrial Physics (2011).

http://www.sciencedirect.com/science/article/pii/S1364682611003385

Also, a booklet version (PDF) with this comment is here

The main results of this new paper are summarized in the paper’s highlights:

1) The IPCC (CMIP3) climate models fail to reproduce the observed decadal and multidecadal climate cycles.

2) Equivalent cycles are found among the major oscillations of the solar system.

3) A correction to the projected anthropogenic warming for the 21st century is proposed.

4) A full empirical model is developed for forecasting climate change over the few decades following 2000.

5) The climate will likely stay steady until 2030/2040 and may warm by about 0.3-1.2 °C by 2100.

Is the science about our climate really settled, as too many have said (though nobody really believes it) and as has already been implemented in computer climate models, the so-called general circulation models (GCMs)? Can we really trust the GCM projections for the 21st century?

These projections, summarized by the IPCC in 2007, predict a significant warming of the planet unless drastic decisions about greenhouse gas emissions are taken; people have also been told that perhaps it is already too late to fix the problem.

However, the scientific method requires that a physical model fulfills two simple conditions: it has to reconstruct and predict (or forecast) physical observations. Thus, it is perfectly legitimate in science to check whether the computer GCMs adopted by the IPCC fulfill the required scientific tests, that is whether these models reconstruct sufficiently well the 20th century global surface temperature and, consequently, whether these models can be truly trusted in their 21st century projections. If the answer is negative, it is perfectly legitimate to look for the missing mechanisms and/or for alternative methodologies.

One of the greatest difficulties in climate science, as I see it, is the fact that we cannot test the reliability of a climate theory or computer model with controlled lab experiments, nor can we study other planets' climates for comparison. How easy it would be to quantify the anthropogenic effect on climate if we could simply observe the climate on another planet identical to the Earth in everything but without humans! But we do not have this luxury.

Unfortunately, we can only test a climate theory or computer model against the available data, and when these data refer to a complex system, it is well known that even an apparently minor discrepancy between a model outcome and the data may reveal major physical problems.

In some of my previous papers, for example,

Scafetta (2011): http://www.sciencedirect.com/science/article/pii/S1364682611002872

Scafetta (2010): http://www.sciencedirect.com/science/article/pii/S1364682610001495

Loehle & Scafetta (2011): http://www.benthamscience.com/open/toascj/articles/V005/74TOASCJ.htm

Mazzarella & Scafetta (2011): http://www.springerlink.com/content/f637064p57n45023/

we have argued that the global instrumental surface temperature records, which are available since 1850 with some confidence, suggest that the climate system is resonating and/or synchronized to numerous astronomical oscillations found in the solar activity, in the heliospheric oscillations due to planetary movements and in the lunar cycles.

The most prominent cycles that can be detected in the global surface temperature records have periods of about 9.1 years, 10-11 years, about 20 years and about 60 years. The 9.1-year cycle appears to be linked to a Soli/Lunar tidal cycle, as I also show in the paper, while the other three cycles appear to be solar/planetary cycles ultimately related to the orbits of Jupiter and Saturn. Other cycles, at all time scales, are present but are ignored in the present paper.

The above four major periodicities can be easily detected in the temperature records with alternative power spectrum analysis methodologies, as the figure below shows:

[Figure 1]

Similar decadal and multidecadal cycles have been observed in numerous climatic proxy models covering centuries and millennia, as documented in the references of my papers, although the proxy models need to be studied with great care because of the large divergences from the temperature record they may present.

The bottom figure highlights the existence of a 60-year cycle in the temperature (red), which becomes clearly visible once the warming trend is removed from the data and the fast fluctuations are filtered out. The black curves are obtained with harmonic models at the decadal and multidecadal scales calibrated on two non-overlapping periods, 1850-1950 and 1950-2010, so that they can validate each other.
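As a rough illustration of the kind of power spectrum analysis involved (everything below is synthetic, with made-up amplitudes and noise, not the actual HadCRUT3 data), a simple FFT periodogram recovers four injected periodicities close to those discussed above:

```python
import numpy as np

# Synthetic 180-year monthly "temperature" series containing four cycles
# near those discussed in the text (9.1, 10.5, 20 and 60 years); the
# amplitudes and noise level are made up for illustration only.
rng = np.random.default_rng(0)
t = np.arange(0, 180, 1 / 12.0)              # years, monthly sampling
true_periods = [9.1, 10.5, 20.0, 60.0]
y = sum(0.1 * np.cos(2 * np.pi * t / p) for p in true_periods)
y += 0.02 * rng.standard_normal(t.size)      # weak measurement noise

# FFT periodogram: spectral power versus frequency (cycles per year)
freqs = np.fft.rfftfreq(t.size, d=1 / 12.0)
power = np.abs(np.fft.rfft(y - y.mean())) ** 2
nonzero = freqs > 0
periods = 1.0 / freqs[nonzero]

# The four strongest spectral peaks sit near the injected periodicities
top4 = sorted(periods[np.argsort(power[nonzero])[-4:]])
print([round(p, 1) for p in top4])
```

Real temperature data are of course noisier and trend-contaminated, which is why the paper cross-checks several alternative spectral methods rather than relying on a single periodogram.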

Although the chain of physical mechanisms generating these cycles is still obscure (I have argued in my previous papers that the available climatic data suggest an astronomical modulation of the cloud cover, which would induce small oscillations in the albedo and, consequently, oscillations in the surface temperature, also by modulating ocean oscillations), the detected cycles can surely be considered, from a purely geometrical point of view, a description of the dynamical evolution of the climate system.

Evidently, the harmonic components of the climate dynamics can be empirically modeled without any detailed knowledge of the underlying physics, in the same way as the ocean tides are currently reconstructed and predicted by means of simple harmonic constituents, as Lord Kelvin realized in the 19th century. Readers should realize that Kelvin's tidal harmonic model is likely the only geophysical model that has been proven to have good predictive capability and that has been implemented in tide-predicting machines; for details see

http://en.wikipedia.org/wiki/Theory_of_tides#Harmonic_analysis

In my paper I implement Kelvin's same philosophical approach in two ways:

  1. by checking whether the GCMs adopted by the IPCC geometrically reproduce the detected global surface temperature cycles;
  2. and by checking whether a harmonic model may be proposed to forecast climate changes. A comparison between the two methodologies is also added in the paper.

I studied all available climate model simulations for the 20th century collected by the Program for Climate Model Diagnosis and Intercomparison (PCMDI), mostly during the years 2005 and 2006; this archived data constitutes phase 3 of the Coupled Model Intercomparison Project (CMIP3) and can be downloaded from http://climexp.knmi.nl/selectfield_co2.cgi?

The paper contains a large supplement file with pictures of all GCM runs and their comparison with the global surface temperature given, for example, by the Climatic Research Unit (HadCRUT3). I strongly invite people to take a look at the numerous figures in the supplement file to get a feeling for the real performance of these models in reconstructing the observed climate, which in my opinion is quite poor at all time scales.

In the figure below I just present the HadCRUT3 record against, for example, the average simulation of the GISS ModelE for the global surface temperature from 1880 to 2003 by using all forcings, which can be downloaded from http://data.giss.nasa.gov/modelE/transient/Rc_jt.1.11.html

[Figure 2]

The comparison clearly emphasizes the strong discrepancy between the model simulation and the temperature data. Qualitatively similar discrepancies are found and are typical for all GCMs adopted by the IPCC.

In fact, although a certain warming trend is reproduced in the model, which appears to agree with the observations, the model simulation clearly fails to reproduce the cyclical dynamics of the climate, which presents an evident quasi-60-year cycle with peaks around 1880, 1940 and 2000. This pattern is further stressed by the synchronized 20-year temperature cycle.

The GISS ModelE model also presents huge volcano spikes that are quite difficult to observe in the temperature record. Indeed, in the supplement file I plot the GISS ModelE signature of the volcano forcing alone against the same signature obtained with two proposed empirical models that extract the volcano signature directly from the temperature data themselves.

[Figure 3]

The figure clearly shows that the GISS ModelE computer model greatly overestimates the volcano cooling signature. The same is true for the other GCMs, as shown in the supplement file of the paper. This issue is quite important, as I will explain later. In fact, there exist attempts to reconstruct climate variations by stressing the climatic effect of volcano aerosol, but the lack of strong volcano spikes in the temperature record suggests that the volcano effect is already overestimated.

In any case, the paper focuses on whether the GCMs adopted by the IPCC in 2007 reproduce the cyclical modulations observed in the temperature records. With a simple regression model based on the four cycles (about 9.1-, 10-, 20- and 60-year periods) plus an upward trend, which can be geometrically captured by a quadratic fit of the temperature, I have shown in the paper that all GCMs adopted by the IPCC fail to geometrically reproduce the detected temperature cycles at both the decadal and the multidecadal scales.

[Figure 4]

For example, the above figure depicts the regression model coefficients “a” (for the 60-year cycle) and “b” (for the 20-year cycle) as estimated for all IPCC GCM runs, which are simply numbered on the abscissa of the figure. Values of “a” and “b” close to 1 would indicate that the model simulation reproduces the corresponding temperature cycle well. As is evident in the figure (and in the tables reported in the paper), all models fail the test quite macroscopically.
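The flavor of this test can be sketched as follows (all series, amplitudes and phases below are synthetic stand-ins, not the paper's actual regression): a record is regressed onto the empirically detected 60-year and 20-year harmonics plus a quadratic trend; a record that contains the cycles returns coefficients near 1, while a trend-only "model" series returns coefficients near 0.

```python
import numpy as np

# Illustrative stand-ins for the cycles detected in the temperature data
# (the amplitudes and phases here are invented, not the paper's estimates)
def detected_cycles(t):
    c60 = 0.10 * np.cos(2 * np.pi * (t - 2000.0) / 60.0)
    c20 = 0.04 * np.cos(2 * np.pi * (t - 2000.0) / 20.0)
    return c60, c20

def cycle_coefficients(t, series):
    c60, c20 = detected_cycles(t)
    X = np.column_stack([c60, c20,                    # -> coefficients a, b
                         np.ones_like(t),             # quadratic trend terms
                         t - 1930.0, (t - 1930.0) ** 2])
    coef, *_ = np.linalg.lstsq(X, series, rcond=None)
    return coef[0], coef[1]

t = np.arange(1850, 2011, 1 / 12.0)
trend = 3e-5 * (t - 1930.0) ** 2
obs = trend + sum(detected_cycles(t))   # record containing the cycles
gcm = trend                             # "model" with trend but no cycles

print(cycle_coefficients(t, obs))   # a, b near 1
print(cycle_coefficients(t, gcm))   # a, b near 0
```

A GCM run could in principle reproduce the warming trend perfectly and still score a ≈ b ≈ 0 on this test, which is exactly the situation the figure above depicts.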

The conclusion is evident, simple and straightforward: all GCMs adopted by the IPCC fail to correctly reproduce the decadal and multidecadal dynamical modulation observed in the global surface temperature record, and thus they do not reproduce the observed dynamics of the climate. Evidently, the “science is settled” claim is false. Indeed, the models are missing important physical mechanisms driving climate change, which may still be quite mysterious and which I believe to be ultimately astronomically induced, as better explained in my other papers.

But now, what can we do with this physical information?

It is important to realize that the “science is settled” claim is a necessary prerequisite for efficiently engineering any physical system with an analytical computer model, as the GCMs attempt to do for the climate system. If the science is not settled, however, such an engineering task is inefficient and theoretically impossible. For example, an engineer could not build a functional electric device (a phone, a radio, a TV or a computer), or a bridge or an airplane, if some of the necessary physical mechanisms were unknown. Engineering does not usually work with a partial science. In medicine, for example, nobody claims to cure people by using some kind of physiological GCM! And GCM computer modelers are essentially climate computer engineers more than climate scientists.

In theoretical science, however, people can attempt to overcome the above problem by using a different kind of model, the empirical/phenomenological one, which has its own limits but also numerous advantages. One needs only to appropriately extract and use the information contained in the data themselves to model the observed dynamics.

Well, in the paper I used the geometrical information deduced from the temperature data to do two things:

  1. I propose a correction to the claimed net anthropogenic warming effect on the climate;
  2. I implement the corrected net anthropogenic warming effect in the harmonic model to produce an approximate forecast of the 21st century global surface temperature, assuming the same IPCC emission projections.

Solving the first point requires a subtle line of reasoning. In fact, it is not possible to directly separate the natural from the anthropogenic component of the upward warming trend observed in the climate since 1850 (about 0.8 °C) by using the harmonic model calibrated on the same data, because with 161 years of data at most a 60-year cycle can be well detected, but not longer cycles.

Indeed, what numerous papers have shown, including some of mine, for example

http://www.sciencedirect.com/science/article/pii/S1364682609002089 , is that this 1850-2010 upward warming trend can be part of a multi-secular/millennial natural cycle, which was also responsible for the Roman warm period, the Dark Ages, the Medieval Warm Period and the Little Ice Age.

The following figure from Humlum et al. (2011), http://www.sciencedirect.com/science/article/pii/S0921818111001457 ,

[Figure 5]

gives an idea of how these multi-secular/millennial natural cycles may appear, by attempting a reconstruction from a pluri-millennial proxy record of the temperature in central Greenland.

However, an accurate modeling of the multi-secular/millennial natural cycles is not currently possible. Their frequencies, amplitudes and phases are not known with great precision because the temperature proxy models look quite different from each other. Essentially, for our study we want to use only the real temperature data, and these data start in 1850, which is evidently too short a record for extracting multi-secular/millennial natural cycles.

To proceed, I have adopted a strategy based on the 60-year cycle, which has been estimated to have an amplitude of about 0.3 °C, as the first figure above shows.

To understand the reasoning a good start is the IPCC’s figures 9.5a and 9.5b which are particularly popular among the anthropogenic global warming (AGW) advocates: http://www.ipcc.ch/publications_and_data/ar4/wg1/en/figure-9-5.html

These two figures are reproduced below:

[Figure 6]

The above figure b shows that without anthropogenic forcing, according to the IPCC, the climate should have cooled from 1970 to 2000 by about 0.0-0.2 °C because of volcano activity. Only the addition of anthropogenic forcings (see figure a) could have produced the 0.5 °C warming observed from 1970 to 2000. Thus, from 1970 to 2000 anthropogenic forcings are claimed to have produced a warming of about 0.5-0.7 °C in 30 years. This warming is then extended into the IPCC GCMs' projections for the 21st century with an anthropogenic warming trend of about 2.3 °C/century, as is evident in the IPCC's figure SPM5 shown below

[Figure 7]

But our trust in this IPCC estimate of the anthropogenic warming effect is directly challenged by the failure of these GCMs to reproduce the 60-year natural modulation, which is responsible for at least about 0.3 °C of warming from 1970 to 2000. Consequently, taking this natural variability into account, the net anthropogenic warming effect should not be above 0.2-0.4 °C from 1970 to 2000, instead of the IPCC-claimed 0.5-0.7 °C.

This implies that the net anthropogenic warming effect must be reduced to at most a range of about 0.5-1.3 °C/century from 1970 to about 2050, assuming the same IPCC emission projections, as argued in the paper. In the paper this result is reached by also taking into account several possibilities, including the fact that the volcano cooling is evidently overestimated in the GCMs, as we have seen above, and that part of the leftover warming from 1970 to 2000 could still have been due to other factors such as urban heat islands and land use change.
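The core arithmetic of the correction can be laid out explicitly (a back-of-envelope version only; the paper's actual range also weighs the volcano overestimation and the urban/land-use contributions mentioned above):

```python
# IPCC-attributed anthropogenic warming over 1970-2000 (°C per 30 years)
ipcc_low, ipcc_high = 0.5, 0.7

# Warming contributed over the same window by the warm phase of the
# 60-year natural cycle (amplitude about 0.3 °C, as estimated above)
natural_60yr = 0.3

net_low, net_high = ipcc_low - natural_60yr, ipcc_high - natural_60yr
rate_low, rate_high = net_low * 100 / 30, net_high * 100 / 30

print(f"net anthropogenic 1970-2000: {net_low:.1f}-{net_high:.1f} °C")
print(f"implied rate: {rate_low:.2f}-{rate_high:.2f} °C/century")
```

This naive subtraction gives roughly 0.67-1.33 °C/century; the slightly wider 0.5-1.3 °C/century range quoted above folds in the additional considerations from the paper.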

At this point it is possible to attempt a full forecast of the climate since 2000, made of the four detected decadal and multidecadal cycles plus the corrected anthropogenic warming trend. The results are depicted in the figures below

[Figure 8]

The figure shows a full climate forecast of my proposed empirical model, against the IPCC projections since 2000. It is evident that my proposed model agrees with the data much better than the IPCC projections, as also other tests present in the paper show.

My proposed model shows two curves: one calibrated over the period 1850-1950 and the other calibrated over the period 1950-2010. It is evident that the two curves reconstruct the climate variability from 1850 to 2011 at the decadal/multidecadal scales equally well, as the smoothed gray temperature curve highlights, with an average error of just 0.05 °C.
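The split-calibration check can be sketched in the same spirit (the series below is synthetic, with an invented trend, amplitudes and noise level; it only illustrates why fitting on one century and validating on the next is a meaningful test of a harmonic model):

```python
import numpy as np

def design(t):
    """Constant + linear trend + the four detected cycles (cos/sin pairs)."""
    cols = [np.ones_like(t), t - 1930.0]
    for p in (9.1, 10.5, 20.0, 60.0):
        w = 2 * np.pi / p
        cols += [np.cos(w * t), np.sin(w * t)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
t = np.arange(1850, 2010, 1 / 12.0)
temp = (0.004 * (t - 1930.0)                          # made-up warming trend
        + 0.11 * np.cos(2 * np.pi * (t - 2000.0) / 60.0)
        + 0.04 * np.cos(2 * np.pi * (t - 2000.0) / 20.0)
        + 0.03 * rng.standard_normal(t.size))         # monthly noise

# Calibrate on 1850-1950 only, then validate on the unseen 1950-2010 data
early = t < 1950
coef, *_ = np.linalg.lstsq(design(t[early]), temp[early], rcond=None)
pred = design(t[~early]) @ coef
err = np.mean(np.abs(pred - temp[~early]))
print(round(err, 3))   # out-of-sample error of a few hundredths of a degree
```

In this idealized setting the out-of-sample error stays near the noise floor; with real data, larger errors would flag cycles that are spurious or drifting in phase.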

The proposed empirical model suggests that the same IPCC-projected anthropogenic emissions would imply a global warming of about 0.3–1.2 °C by 2100, as opposed to the IPCC's projected warming of 1.0–3.6 °C. My proposed estimate also excludes an additional possible cooling that may derive from the multi-secular/millennial cycle.

Some implicit and evident consequences of this finding are that, for example, the ocean may rise considerably less than the IPCC projects, let us say by a third of that amount (about 5 inches/12.5 cm) by 2100, and that we probably do not need to destroy our economy in attempting to reduce CO2 emissions.

Will my forecast curve work, hopefully, for at least a few decades? Well, my model is not an “oracle crystal ball.” As happens with the ocean tides, numerous other natural cycles may be present in the climate system at all time scales and may produce interesting interference patterns and complex dynamics. Other nonlinear factors may be present as well, and sudden events such as volcano eruptions can always disrupt the dynamical pattern for a while. So, the model can surely be improved.

Perhaps the model I propose is just another illusion; we do not yet know for sure. What we can do is continue and improve our research and, month after month, add new temperature data points to the graph to see how the proposed forecast performs, as depicted in the figure below:

[Figure 9]

The above figure shows an updated version of the graph published in the paper, where the temperature record in red stops in Oct/2011. The figure adds the Nov/2011 temperature value in blue. The monthly temperature data are from http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt

The empirical forecast curve (the black curve, made of the harmonic component plus the proposed corrected anthropogenic warming trend) is in good agreement with the data up to now. Okay, it is just one month, somebody may say, but note that the depicted forecasting model started in Jan/2000!

By comparison, the figure shows in yellow the harmonic component alone made of the four cycles, which may be interpreted as a lower boundary for the natural variability, based on the same four cycles.

[Figure 10]

In conclusion, the empirical model proposed in the current paper is surely a simplified model that can probably be improved, but it already appears to greatly outperform all current GCMs adopted by the IPCC, such as the GISS ModelE. All of them fail to reconstruct the decadal and multidecadal cycles observed in the temperature records and have failed to properly forecast the roughly steady global surface temperature observed since 2001.

It is evident that a climate model would be useful for any civil strategic purpose only if it were proven capable of predicting the climate evolution at least on a decadal/multidecadal scale. The traditional GCMs have so far failed at this goal, as shown in the paper.

The attempts of some current climate modelers to explain and excuse the failure of their GCMs to properly forecast the approximately steady climate of the last 10 years are very unsatisfactory for any practical and theoretical purpose. In fact, some of the proposed explanations are: 1) a presumed underestimation of the cooling effect of small volcano eruptions [Solomon et al., Science 2011] (while the GCM volcano effect is already evidently overestimated!); or 2) hypothetical Chinese aerosol emissions [Kaufmann et al., PNAS 2011] (which, however, were likely decreasing since 2005!); or 3) a 10-year “red noise” unpredictable fluctuation of the climate system driven by an ocean heat content fluctuation [Meehl et al., NCC 2011] (which, however, in the model simulations occurs in 2055 and 2075!).

Apparently, these GCMs can “forecast” climate change only “a posteriori”: that is, for example, if we want to know what may happen with these GCMs from 2012 to 2020, we first need to wait until 2020 and then adjust the GCM with ad hoc physical explanations, including even an appeal to an unpredictable “red-noise” fluctuation of the ocean heat content and flux system (occurring in the model in 2055 and 2075!), to attempt to explain the data during surface temperature hiatus periods that contradict the projected anthropogenic GHG warming!

Indeed, if this is the situation, it is really impossible to forecast climate change even a few decades ahead, and the practical usefulness of these kinds of GCMs is quite limited and potentially very misleading, because a model may project a 10-year warming and then the “red-noise” dynamics of the climate system may completely change the projected pattern!

The fact is that the above ad hoc explanations appear to be in conflict with the dynamics of the climate system as evident since 1850. Indeed, these dynamics suggest a major multi-harmonic component influencing the climate, with a likely astronomical origin (sun + moon + planets) although not yet fully understood in its physical mechanisms, which, as shown in the above figures, can apparently also explain the post-2000 climate quite satisfactorily (even using my model calibrated from 1850 to 1950, that is, more than 50 years before the observed temperature hiatus period since 2000!).

Perhaps, a new kind of climate models based, at least in part, on empirical reconstruction of the climate constructed on empirically detected natural cycles may indeed perform better, may have better predicting capabilities and, consequently, may be found to be more beneficial to the society than the current GCMs adopted by the IPCC.

So, is a kind of Copernican Revolution needed in climate change research, as Alan Carlin has also suggested? http://www.carlineconomics.com/archives/1456

I personally believe that there is an urgent need to invest more funding in scientific methodologies alternative to the traditional GCM approach and, in general, to invest more in pure climate science research rather than just in climate GCM engineering research, as has been done until now on the false claim that there is no need to invest in pure science because the “science is already settled.”

As for the other common AGW slogan, according to which the current mainstream AGW climate science cannot be challenged because it is based on the so-called “scientific consensus,” I would strongly suggest reading this post by Kevin Rice at the blog Catholibertarian, entitled “On the dangerous naivety of uncritical acceptance of the scientific consensus”:

http://catholibertarian.com/2011/12/30/on-the-dangerous-naivete-of-uncritical-acceptance-of-the-scientific-consensus/

It is a very educational and open-minded read, in my opinion.

Nicola Scafetta, “Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models” Journal of Atmospheric and Solar-Terrestrial Physics (2011).

http://www.sciencedirect.com/science/article/pii/S1364682611003385

http://scienceandpublicpolicy.org/reprint/astronomical_harmonics_testing.html

Abstract:

We compare the performance of a recently proposed empirical climate model based on astronomical harmonics against all CMIP3 available general circulation climate models (GCM) used by the IPCC (2007) to interpret the 20th century global surface temperature. The proposed astronomical empirical climate model assumes that the climate is resonating with, or synchronized to a set of natural harmonics that, in previous works (Scafetta, 2010b, 2011b), have been associated to the solar system planetary motion, which is mostly determined by Jupiter and Saturn. We show that the GCMs fail to reproduce the major decadal and multidecadal oscillations found in the global surface temperature record from 1850 to 2011. On the contrary, the proposed harmonic model (which herein uses cycles with 9.1, 10–10.5, 20–21, 60–62 year periods) is found to well reconstruct the observed climate oscillations from 1850 to 2011, and it is shown to be able to forecast the climate oscillations from 1950 to 2011 using the data covering the period 1850–1950, and vice versa. The 9.1-year cycle is shown to be likely related to a decadal Soli/Lunar tidal oscillation, while the 10–10.5, 20–21 and 60–62 year cycles are synchronous to solar and heliospheric planetary oscillations. We show that the IPCC GCM’s claim that all warming observed from 1970 to 2000 has been anthropogenically induced is erroneous because of the GCM failure in reconstructing the quasi 20-year and 60-year climatic cycles. Finally, we show how the presence of these large natural cycles can be used to correct the IPCC projected anthropogenic warming trend for the 21st century. By combining this corrected trend with the natural cycles, we show that the temperature may not significantly increase during the next 30 years mostly because of the negative phase of the 60-year cycle. 
If multisecular natural cycles (which according to some authors have significantly contributed to the observed 1700–2010 warming and may contribute to an additional natural cooling by 2100) are ignored, the same IPCC projected anthropogenic emissions would imply a global warming by about 0.3–1.2 °C by 2100, contrary to the IPCC 1.0–3.6 °C projected warming. The results of this paper reinforce previous claims that the relevant physical mechanisms that explain the detected climatic cycles are still missing in the current GCMs and that climate variations at the multidecadal scales are astronomically induced and, in first approximation, can be forecast.

BarryW
January 9, 2012 2:31 pm

Dr. Scafetta, could you please provide a download site/source to retrieve the data points from the output of your calculation? I and, I'm sure, others would like to look at the results in more detail.

January 9, 2012 2:49 pm

M.A.Vukcevic says:
I am sorry, but I have a very strong impression that you never look at the references, nor do you read my papers. You simply repeat your point again and again, referencing the CET record, which is a complex record and is not appropriate for this analysis without a detailed study.
Ok, let us try again; let us talk about only one case for simplicity. Look here
http://www.vliz.be/imisdocs/publications/218039.pdf
and look at figure 5.
Describe to me exactly what that figure shows, so we know that you took a look at it. Note that the figure is made of multiple panels. I am asking you to describe with precision what each of those panels shows.
If you give me the impression that you are actually looking at the figures, we can try to analyze another paper.
BarryW: as stated in the article, the archived CMIP3 data can be downloaded from
http://climexp.knmi.nl/selectfield_co2.cgi?
The monthly temperature data are from http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt

January 9, 2012 3:27 pm

Re. Fig 5.
It is not exactly clear what you wish to prove with it; there is a hotchpotch of periods there. Proxy dating of any kind (excluding natural annual growth, e.g. tree rings or coral) has large margins (for 2k years back, as much as ±25 or more years due to the particle diffusion process).
On the basis of Fig. 5, in the industry where I spent some decades, you wouldn't get a look in, not to mention a budget for evaluation, let alone construction of a project.
“The blue circle denotes the spectral peak for the instrumental AMO record” is a false assertion; the AMO was discovered in the 1990s and reconstructed backward from incomplete sea surface temperatures. Only numbers after 1970 can be considered reliable; as for prior to 1950, I wouldn't put my shirt on it. Relevant papers:
Corrections to Pre-1941 SST Measurements for Studies of Long-Term Changes in SSTs, Jones et al
http://icoads.noaa.gov/Boulder/Boulder.Jones.pdf
Assessing bias corrections in historical sea surface temperature using a climate model, Folland
ftp://ftp.wmo.int/Documents/PublicWeb/amp/mmop/documents/JCOMM-TR/J-TR-13-Marine-Climatology/REV1/joc1171.pdf
Reassessing biases and other uncertainties in sea surface temperature observations, Kennedy et al
ftp://ftp.astr.ucl.ac.be/publi/2011_08_03-08h24-hugues.goosse-14.pdf

Theo Goodwin
January 9, 2012 3:32 pm

Nicola Scafetta says:
January 9, 2012 at 1:27 pm
Theo Goodwin says: “…..”
“If I will be able to understand what you are trying to say, I too will try to give a response.”
You begin with the following:
“However, the scientific method requires that a physical model fulfills two simple conditions: it has to reconstruct and predict (or forecast) physical observations. Thus, it is perfectly legitimate in science to check whether the computer GCMs adopted by the IPCC fulfill the required scientific tests, that is whether these models reconstruct sufficiently well the 20th century global surface temperature and, consequently, whether these models can be truly trusted in their 21st century projections. If the answer is negative, it is perfectly legitimate to look for the missing mechanisms and/or for alternative methodologies.”
Models cannot be used to predict anything. You, like the vast majority of climate scientists today, show that you are ignorant of the differences between models and theories. Theories can be used for prediction but models serve only to reproduce (reconstruct, in your terms) reality. Now, how can you predict anything from a reconstruction of reality? To get right to the point without taking the time to be polite, and I beg your pardon, what you are doing is extrapolating lines on past graphs into lines on graphs about the future. That is not a recipe for science, though it is not without value.
My time is extremely limited so I have copied here some comments of mine from Andrew Montford’s website. I hope they clarify for you the differences between models and theories. Also, I hope that they make clear that the well confirmed physical hypotheses that make up theories are essential to science while models are really wonderful analytic tools but secondary to the scientific enterprise. I recommend that you read the entire post (not by me) and comments at Montford’s website.
http://bishophill.squarespace.com/blog/2012/1/4/conveying-truth.html?currentPage=3#comments
“Thanks for your question. A little terminology will clarify the matter. Newton’s theory is a physical theory and its several hypotheses are well confirmed as it applies to our solar system. Of course it does not provide the detail or reach of Einstein’s theory but it does just fine in our solar system.
Our solar system is a model of Newton’s equations. A model is a set of objects that renders true all the individual statements in a physical theory. One can construct computer models of Newton’s theory. Some company sells an “observatory” that will project our solar system on the semicircular ceiling that you have constructed just for this purpose and it will predict and postdict planetary movement and such. A model that does prediction and postdiction can exist because of Newton’s theory. In other words, the programmers actually used Newton’s equations to calculate where all the shiny little dots should appear on the ceiling in the future or past as you dial-up one time or another.
The climate scientists who are creating GCMs, models of Earth’s climate, have no set of physical hypotheses that play the role of Newton’s equations in our little observatory. All they have are Arrhenius’s equations and a lot of data about climate. Arrhenius’ equations have never been rigorously formulated for the actual Earth. So they are not well confirmed in Earth’s atmosphere. No less important, everyone knows, as Arrhenius knew, that Arrhenius’ equations are not enough to explain or predict Earth’s climate. In addition, you need the physical hypotheses that govern all the so-called “feedbacks” such as cloud behavior. These physical hypotheses do not exist in any form that could be considered well confirmed. Much empirical research must be done before they can exist.
As for the data, models contain wonderful differential equations that manipulate the data in wonderful ways; however, all of that data manipulation is nothing more than a sophisticated method of extrapolating the future from existing graphs. That is not science. That is a system of hunches.
I hope you now understand the difference between theories and models. Theories describe the natural regularities that make up nature while models reproduce the objects or events that are nature. It is not possible to make predictions from a model. If you had the perfect model of Earth’s climate all you would have is Earth’s climate. How can you make predictions from that?”
Jan 5, 2012 at 6:59 PM | Theo Goodwin
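The "observatory" in the quoted comment is easy to make concrete: given Newton's equations, a few lines of code suffice to compute where the shiny little dots should appear. A minimal sketch (illustrative units and step size, not any particular product's code), using the standard leapfrog scheme:

```python
import math

# A toy "observatory" in the spirit of the quoted comment: Newton's
# equations, integrated with the leapfrog scheme, predict where a
# planet will be. Units: AU and years, so GM_sun = 4*pi^2.
GM = 4.0 * math.pi ** 2

def integrate_orbit(x, y, vx, vy, t_end, dt=1e-4):
    """Advance a planet under solar gravity from t = 0 to t_end."""
    for _ in range(int(round(t_end / dt))):
        r3 = (x * x + y * y) ** 1.5
        vx += 0.5 * dt * (-GM * x / r3)   # half kick
        vy += 0.5 * dt * (-GM * y / r3)
        x += dt * vx                      # drift
        y += dt * vy
        r3 = (x * x + y * y) ** 1.5
        vx += 0.5 * dt * (-GM * x / r3)   # half kick
        vy += 0.5 * dt * (-GM * y / r3)
    return x, y, vx, vy

# A circular 1 AU orbit (speed 2*pi AU/yr) has a period of one year,
# so integrating for t_end = 1 should return the planet near its start.
x, y, vx, vy = integrate_orbit(1.0, 0.0, 0.0, 2.0 * math.pi, 1.0)
```

Run with other initial conditions the same routine postdicts as readily as it predicts; what makes either possible is the well confirmed theory, exactly as the comment says.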
What follows contains a more complete description of physical hypotheses and their absolutely essential role at the heart of science and scientific method. It is also from Montford’s blog.
“You have failed to grasp the difference between physical hypotheses and models. Physical hypotheses describe natural regularities. The key word here is “describe.” Physical hypotheses are about some aspect of physical reality and the true or really well confirmed physical hypotheses (which make up mature theories, eventually) tell us what that reality is. The key word here is “about.” The physical hypotheses are creations of intellect that stand apart from reality and tell us about reality, tell us what reality is.
By contrast, models produce simulations that reproduce some salient features of reality. Simulations do not describe reality and are not about reality and, for those reasons, simulations are neither true nor false. Simulations are complete or not. They give an exact reproduction of reality or they fail as simulations to some degree. Simulations are not creations of intellect; rather, the computer code that generates them is a creation of intellect. However, the computer code does not describe reality and is not about reality. In sum, the value of a simulation depends entirely on whether and to what degree it reproduces reality. Why would you think that a reproduction of reality can be used to predict reality?
Physical hypotheses bear an important logical relationship to the reality that they describe. When combined with statements of initial conditions specifying observable fact, they logically imply observation sentences about future events. These observation sentences are what logicians call “instances” of the natural regularities described by the physical hypotheses. A record of predictions found true make physical hypotheses well confirmed and make for them a place in science. Note the centrality of “natural regularities.” The purpose of science is to discover the natural regularities that comprise nature.
By contrast, can you specify some logical relationship that exists between reality and a model and its simulations? You cannot because there is none. That is why the usefulness of models in science is limited to analytical work such as discovering hidden assumptions.
Models cannot substitute for well confirmed physical hypotheses. The point is not based on temporary or practical considerations but on the very logic of the two structures.”
Jan 6, 2012 at 4:40 AM | Theo Goodwin

Michael Reed
January 9, 2012 3:36 pm

Typo alert! “Guest Post by Dr, Nicola Scafetta” should have a period after the “Dr,” not a comma. One of the few useful contributions my liberal arts education permits me on this science-heavy forum.
[Your education paid dividends. Typo fixed. ~dbs, mod.]

January 9, 2012 3:50 pm

@Theo Goodwin,
Unfortunately I do not understand how your philosophical essay fits my study.
You say : “Models cannot be used to predict anything. ”
I do not agree. In science, models are always used to try to predict something. Whether they succeed, and to what degree, is another matter that can be tested through analysis of the data.

January 9, 2012 3:56 pm

M.A.Vukcevic says:
“Re. Fig 5.
It is not exactly clear what you wish to prove with it; there is a hotchpotch of periods there. Proxy dating of any kind (excluding natural annual growth, e.g. tree rings or coral) has large margins (for 2,000 years back it could be ±25 years or more, due to particle diffusion processes).”
You have just shown me that when I ask you to look at some data, you are not able to do it. So there is no reason for me to try to explain things to you.

January 9, 2012 3:59 pm

@johnnythelowery says:
January 9, 2012 at 1:48 pm
What if the Loehle reconstruction is very poor through the 7th century:
http://wattsupwiththat.com/2012/01/04/solar-cycle-update-sunspots-down-ap-index-way-down/#comment-856954

Greg Cavanagh
January 9, 2012 4:21 pm

Dr. Scafetta, Your modeling shows a continuous increasing trend to 2100.
I’m curious. If the model continues:
Does it ever reach a maximum?
Does it model the larger climate minima and maxima over a 2000 year time scale?
Can it make any predictions for the next ice age?

Jay Curtis
January 9, 2012 4:28 pm

Ugh! This is exhausting.
[Reply: snip–yes it is, barycentric effect theories are a prohibited topic on this site. ~ctm]

January 9, 2012 4:36 pm

Greg Cavanagh says: “…..”
The upward trend is based on the same emission scenarios used by the IPCC. These stop in 2100, as does the model. After 2100 the emission scenarios show a plateau, so my model will also stabilize.
However, my model may work only for a few decades, because on longer time scales other, longer cycles not included in the model will become important.

January 9, 2012 4:40 pm

An excellent article. I would add that Prof Claes Johnson has now proved that there can be no warming of the surface by backradiation, so the greenhouse effect is a non-event which you don’t need to allow for.
I also wondered what you think of the inverted plot of the sum of the scalar angular momentum of the Sun and 9 planets as here: http://earth-climate.com/planetcycles.jpg

Johnnythelowery
January 9, 2012 5:20 pm

Nicola: Thank you for your response. Interesting posting.

Greg Cavanagh
January 9, 2012 5:35 pm

Dr. Scafetta “However, my model may work for a few decades because on longer scales other longer cycles, not included in the model, will be important.”
My thinking exactly. The longer time scale “ups and downs” could make your 2100 estimate somewhat more inaccurate than is useful. Though your modelling shows obvious usefulness in the shorter timescales.
Personally, I like the empirical approaches to future estimates much more than the first-principles approach. I like your work.

January 9, 2012 5:38 pm

Doug Cotton says: “….”
it is an interesting graph.
The longer cycles are the most difficult to determine.
But I am trying to develop an extended model.

jimmi_the_dalek
January 9, 2012 6:40 pm

On the barycenter:
From various comments, here and in other threads, it appears that some believe that the position of the barycenter matters. It does not. No real physical effect can depend upon the position of the barycenter. The reason is simple. The barycenter is not real. It is a mathematical abstraction. There is no mass at the barycenter. It is not a source of gravitational attraction, torque, magnetic fields, or anything else. It is simply the origin of a particular coordinate system designed to make solving for the motions of the planets easier. No physical property can ever depend upon the choice of coordinate system, as coordinate systems are mathematical inventions, not real objects.
To show that it can have no effect I will use a reductio ad absurdum. Imagine including the next nearest star system in your equations – the Centauri system, which has three components that we know about with a combined mass more than twice that of our sun. Although it would make the maths more difficult, you could solve the equations of motion for the positions of our star, the planets and the elements of the Centauri system. And guess what – you would get exactly the same answers for the relative motions of our star and planets, despite the fact that for the combined system the barycenter would be roughly 3 light-years away.
[Reply: thanks, but even dismissals will be limited as they spawn further replies ~ ctm]
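The invariance argument above can be checked in a few lines: Newtonian forces depend only on pairwise separations, so translating every body by the same vector (relocating the coordinate origin, and with it the barycenter's coordinates) changes nothing. A toy sketch with made-up masses and positions:

```python
import math

# Pairwise Newtonian forces depend only on separations, so shifting
# every body by a constant vector (i.e. moving the coordinate origin,
# and hence the barycenter's coordinates) leaves every force unchanged.
# The masses and positions below are purely illustrative.

def pairwise_forces(positions, masses, G=1.0):
    n = len(positions)
    forces = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            forces[i][0] += G * masses[i] * masses[j] * dx / r3
            forces[i][1] += G * masses[i] * masses[j] * dy / r3
    return forces

masses = [1000.0, 1.0, 0.5]                         # toy star and two planets
pos = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
shifted = [[x + 1.0e5, y - 3.0e5] for x, y in pos]  # origin moved far away

f1 = pairwise_forces(pos, masses)
f2 = pairwise_forces(shifted, masses)
same = all(math.isclose(a, b) for fa, fb in zip(f1, f2)
           for a, b in zip(fa, fb))
print(same)   # True: the barycenter's coordinates are physically irrelevant
```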

Johnnythelowery
January 9, 2012 6:41 pm

…so, I think it’s all going to kick off on this thread. Generally, when Leif issues a rebuff, it tends to leave its target… well… in the buff, really. I think I’ll get the beers and popcorn in for this one!

ferd berple
January 9, 2012 7:24 pm

UnfrozenCavemanMD says:
January 9, 2012 at 6:38 am
I don’t see why models that assume from the start that variation is due to internal oscillations are any more valid
Because linear relationships in Nature have long since been eliminated by time. Only cyclical relationships survive billions of years. Thus trying to model nature with linear functions is not going to be accurate.

ferd berple
January 9, 2012 7:42 pm

Bart says:
January 9, 2012 at 10:23 am
There could be influences from Jupiter and Saturn, but I think it is unlikely. Tidal forces from the great planets at the Earth are just tiny. More likely, IMHO, is that the correlation is coincidental.
The solar system moves in near-integer harmonics. The tidal forces are tiny, yet the odds of this being due to chance are fantastically small. We like to think we know the secrets of the universe. We have only scratched the surface.
The force of a child on a swing is tiny, yet the result is large. However, if you only watched the child when they first got on the swing, you would never guess what was going to happen. Without having seen a swing, you would guess the child might rock back and forth an inch or two, according to their puny motions.
The missing ingredient in our view of reality is time. We fail to see this because our lives are so short. Over time, a drop of water will cut a mountain in half.
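The swing analogy is ordinary resonance, and it is easy to demonstrate numerically: a tiny periodic push at a system's natural frequency builds a response far larger than the same push at a mismatched frequency. A toy oscillator sketch (all parameters illustrative, not tuned to any astronomical system):

```python
import math

# A tiny periodic push at the natural frequency of a lightly damped
# oscillator produces a large response; the same push at a mismatched
# frequency does not. All parameters are illustrative.

def max_amplitude(drive_freq, f0=0.05, damping=0.01, push=1e-3,
                  dt=1.0, steps=20000):
    """Peak displacement of a lightly damped oscillator under a small drive."""
    w0, wd = 2 * math.pi * f0, 2 * math.pi * drive_freq
    x = v = peak = 0.0
    for k in range(steps):
        a = -w0 * w0 * x - damping * v + push * math.sin(wd * k * dt)
        v += a * dt                 # semi-implicit Euler step
        x += v * dt
        peak = max(peak, abs(x))
    return peak

on = max_amplitude(0.05)    # pushed exactly at the natural frequency
off = max_amplitude(0.02)   # the same tiny push, wrong frequency
print(on > 10 * off)        # resonance wins by a wide margin
```

As with the child on the swing, the effect only becomes visible when the integration runs long enough for the small pushes to accumulate.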

David A. Evans
January 9, 2012 7:55 pm

Unless I misread this, the training period was from 1850 to 1950.
Results look pretty good from that.
Physical models as I recall are trained to present day.
Modelling from incomplete physics is always going to fail; empiricism rules.
This doesn’t change my view that there is no meaningful Global average temperature.
DaveE.

ferd berple
January 9, 2012 8:05 pm

tallbloke says:
January 9, 2012 at 6:18 am
Start by funding these two guys
http://tallbloke.wordpress.com/2012/01/09/two-more-theories-relevant-to-the-unified-theory-of-climate-by-nikolov-and-zeller/
Gore made use of his time as VP to ensure only those folks on the AGW bandwagon would have a say in allocating research funds. Then he invested in AGW. A generation of science was lost as a result.

oMan
January 9, 2012 8:54 pm

CTM re the snip: My bad. I didn’t know that topic was off-limits. Having done a little more reading, I think I see why it’s a dead end.

January 9, 2012 11:38 pm

Dr. Scafetta, I take very little notice of, don’t waste my time on, and don’t take for granted little ‘pretty pictures’, wherever they come from. I take interest only if I can reproduce them from the data myself, as many visitors of this blog know from the many graphs I have originated and presented. Call it whatever you wish; many were, and are, happy to accept what Dr. Mann said about his ‘hockey stick’ graph without question.

January 10, 2012 12:48 am

Dr. Scafetta
here is something I just put up on another WUWT thread you should have a good look
http://www.vukcevic.talktalk.net/CET-SW.htm
Notice the common components at 22 years (the solar magnetic Hale cycle; by the way, there is not much, if anything, at 11 years) and around 70 years, but also the non-synchronous ones at 55 years for summer and ~90 years for winter; disappointingly, there is nothing at 60 years. Dismissing the longest and most accurate world temperature record (the CET) doesn’t do much for credibility.

Frank
January 10, 2012 6:36 am

Bart makes a good point when he says that you don’t need cyclic forcings to get cyclic behaviour. The 9, 10, 20 and 60 year cycles could just be natural frequencies of the complex system that is our climate and they’ll appear with any random forcing. So trying to correlate with orbits of far-away planets might actually not be relevant.
The problem with forecasting on the basis of these four oscillations is that such natural frequencies can shift and/or change in magnitude. Figure 1 shows another cycle of approximately 14 years, which is not much shorter than the 10-year cycle. Maybe it was larger than the 10-year peak at some time?
It would be interesting to see how these peaks vary over time, e.g. with a wavelet transform.
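The point that broadband random forcing alone excites a system's natural frequencies can be illustrated with a stochastically driven damped oscillator: the forcing has a flat spectrum, yet the response spectrum peaks at the oscillator's own frequency. A rough sketch (illustrative parameters; a real analysis would use a proper spectral or wavelet estimator):

```python
import cmath, math, random

# Drive a damped oscillator with pure white noise (flat spectrum, no
# periodicity at all): the response spectrum still peaks near the
# oscillator's own natural frequency. Parameters are illustrative.

random.seed(0)
f0, zeta, dt, n = 0.05, 0.05, 1.0, 4096   # natural frequency in cycles/step
w0 = 2 * math.pi * f0
x = v = 0.0
series = []
for _ in range(n):
    a = -2 * zeta * w0 * v - w0 * w0 * x + random.gauss(0.0, 1.0)
    v += a * dt                            # semi-implicit Euler step
    x += v * dt
    series.append(x)

def power(freq):
    """Crude periodogram value: squared magnitude of a direct DFT."""
    z = sum(s * cmath.exp(-2j * math.pi * freq * k)
            for k, s in enumerate(series))
    return abs(z) ** 2

freqs = [i / 500 for i in range(3, 100)]   # 0.006 .. 0.198 cycles/step
peak = max(freqs, key=power)
print(peak)   # close to f0 = 0.05, despite the purely random forcing
```

Whether a spectral peak in the climate record reflects an external cyclic forcing or an internally resonant mode is exactly the question this kind of null model helps frame.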