Scafetta on his latest paper: Harmonic climate model versus the IPCC general circulation climate models

Guest Post by Dr. Nicola Scafetta


Herein, I would like to briefly present my latest publication that continues my research about the meaning of natural climatic cycles and their implication for climate changes:

Nicola Scafetta, “Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models” Journal of Atmospheric and Solar-Terrestrial Physics (2011).

http://www.sciencedirect.com/science/article/pii/S1364682611003385

Also, a booklet version (PDF) with this comment is here

The main results of this new paper are summarized in the paper’s highlights:

1) The IPCC (CMIP3) climate models fail to reproduce the observed decadal and multidecadal climate cycles.

2) Equivalent cycles are found among the major oscillations of the solar system.

3) A correction for the projected anthropogenic warming for the 21st century is proposed.

4) A full empirical model is developed for forecasting climate change for a few decades after 2000.

5) The climate will likely stay steady until 2030/2040 and may warm by about 0.3-1.2 °C by 2100.

About our climate: is the science really settled, as nobody really believes but too many have said, and as is already assumed in the computer climate models, the so-called general circulation models (GCMs)? Can we really trust the GCM projections for the 21st century?

These projections, summarized by the IPCC in 2007, predict a significant warming of the planet unless drastic decisions about greenhouse gas emissions are taken; people have also been told that perhaps it is already too late to fix the problem.

However, the scientific method requires that a physical model fulfills two simple conditions: it has to reconstruct and predict (or forecast) physical observations. Thus, it is perfectly legitimate in science to check whether the computer GCMs adopted by the IPCC fulfill the required scientific tests, that is whether these models reconstruct sufficiently well the 20th century global surface temperature and, consequently, whether these models can be truly trusted in their 21st century projections. If the answer is negative, it is perfectly legitimate to look for the missing mechanisms and/or for alternative methodologies.

One of the greatest difficulties in climate science, as I see it, is that we cannot test the reliability of a climate theory or computer model with controlled lab experiments, nor can we study other planets’ climates for comparison. How easy it would be to quantify the anthropogenic effect on climate if we could simply observe the climate on another planet identical to the Earth in everything but without humans! But we do not have this luxury.

Unfortunately, we can only test a climate theory or computer model against the available data, and when these data refer to a complex system, it is well known that an even apparently minor discrepancy between a model outcome and the data may reveal major physical problems.

In some of my previous papers, for example,

Scafetta (2011): http://www.sciencedirect.com/science/article/pii/S1364682611002872

Scafetta (2010): http://www.sciencedirect.com/science/article/pii/S1364682610001495

Loehle & Scafetta (2011): http://www.benthamscience.com/open/toascj/articles/V005/74TOASCJ.htm

Mazzarella & Scafetta (2011): http://www.springerlink.com/content/f637064p57n45023/

we have argued that the global instrumental surface temperature records, which are available since 1850 with some confidence, suggest that the climate system is resonating and/or synchronized to numerous astronomical oscillations found in the solar activity, in the heliospheric oscillations due to planetary movements and in the lunar cycles.

The most prominent cycles that can be detected in the global surface temperature records have periods of about 9.1 years, 10-11 years, about 20 years and about 60 years. The 9.1-year cycle appears to be linked to a Soli/Lunar tidal cycle, as I also show in the paper, while the other three cycles appear to be solar/planetary cycles ultimately related to the orbits of Jupiter and Saturn. Other cycles, at all time scales, are present but are ignored in the present paper.

The above four major periodicities can be easily detected in the temperature records with alternative power spectrum analysis methodologies, as the figure below shows:

figure 1
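For readers who want to experiment, the kind of spectral peak detection described above can be sketched in a few lines of Python. The script below is illustrative only: it builds a synthetic series from the four periods named in the text (not the actual HadCRUT3 data) and checks that a plain periodogram recovers them.

```python
import numpy as np

# Synthetic monthly series (1850-2010) built from the four periods named
# in the text (~9.1, ~10.5, ~21 and ~61 yr) plus a linear trend and noise.
# Illustrative stand-in only -- NOT the HadCRUT3 record used in the paper.
rng = np.random.default_rng(42)
t = np.arange(1850, 2011, 1 / 12)             # monthly sampling, in years
periods = [9.1, 10.5, 21.0, 61.0]
y = sum(0.1 * np.cos(2 * np.pi * t / p) for p in periods)
y += 0.005 * (t - 1850) + 0.05 * rng.standard_normal(t.size)

# Classical periodogram of the linearly detrended series.
resid = y - np.polyval(np.polyfit(t, y, 1), t)
power = np.abs(np.fft.rfft(resid)) ** 2
freq = np.fft.rfftfreq(t.size, d=1 / 12)      # frequencies in cycles/year

# The strongest peaks should sit near the injected frequencies.
top = np.argsort(power[1:])[::-1][:10] + 1    # skip the f = 0 bin
print(sorted(round(1 / freq[i], 1) for i in top))
```

With only 161 years of data the frequency resolution near the 60-year line is coarse, which is one reason multiple independent spectral methodologies are compared in the paper.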

Similar decadal and multidecadal cycles have been observed in numerous climatic proxy models covering centuries and millennia, as documented in the references of my papers, although the proxy models need to be studied with great care because of the large divergence from the instrumental temperature that they may present.

The bottom figure highlights the existence of a 60-year cycle in the temperature (red), which becomes clearly visible once the warming trend is removed from the data and the fast fluctuations are filtered out. The black curves are obtained with harmonic models at the decadal and multidecadal scales calibrated on two non-overlapping periods, 1850-1950 and 1950-2010, so that they can validate each other.

Although the chain of physical mechanisms generating these cycles is still obscure (I have argued in my previous papers that the available climatic data suggest an astronomical modulation of the cloud cover, which would induce small oscillations in the albedo and, consequently, oscillations in the surface temperature, also by modulating ocean oscillations), the detected cycles can surely be considered, from a purely geometrical point of view, as a description of the dynamical evolution of the climate system.

Evidently, the harmonic components of the climate dynamics can be empirically modeled without any detailed knowledge of the underlying physics, in the same way as the ocean tides are currently reconstructed and predicted by means of simple harmonic constituents, as Lord Kelvin realized in the 19th century. Readers should realize that Kelvin’s tidal harmonic model is likely the only geophysical model that has been proven to have good predictive capabilities and has been implemented in tide-predicting machines; for details see

http://en.wikipedia.org/wiki/Theory_of_tides#Harmonic_analysis
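As a sketch of what Kelvin-style harmonic analysis involves, the Python fragment below fits the amplitude and phase of two tidal constituents of known period to a synthetic sea-level record by linear least squares, then predicts beyond the fitting window. The constituent periods (M2, S2) are standard; the amplitudes, phases and noise level are invented for illustration.

```python
import numpy as np

# Kelvin-style harmonic analysis on synthetic data: fit amplitude and
# phase of constituents of KNOWN period by least squares, then predict.
rng = np.random.default_rng(1)
M2, S2 = 12.4206, 12.0                   # constituent periods in hours
t = np.arange(0, 30 * 24, 0.5)           # 30 days of half-hourly levels

def tide(t):                             # the "true" tide (made up here)
    return (1.2 * np.cos(2 * np.pi * t / M2 - 0.7)
            + 0.4 * np.cos(2 * np.pi * t / S2 + 0.3))

h = tide(t) + 0.1 * rng.standard_normal(t.size)   # noisy observations

def design(t):                           # one cos/sin pair per constituent
    cols = [np.ones_like(t)]
    for p in (M2, S2):
        cols += [np.cos(2 * np.pi * t / p), np.sin(2 * np.pi * t / p)]
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(design(t), h, rcond=None)

# Predict 10 days beyond the fitting window and measure the error.
t_new = np.arange(30 * 24, 40 * 24, 0.5)
pred = design(t_new) @ coef
rms = np.sqrt(np.mean((pred - tide(t_new)) ** 2))
print(f"out-of-sample RMS error: {rms:.3f} m")
```

The same cos/sin regression trick carries over directly to a harmonic climate model; only the constituent periods change.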

In my paper I implement Kelvin’s philosophical approach in two ways:

  1. by checking whether the GCMs adopted by the IPCC geometrically reproduce the detected global surface temperature cycles;
  2. and by checking whether a harmonic model may be proposed to forecast climate changes. A comparison between the two methodologies is also added in the paper.

I studied all available climate model simulations for the 20th century collected by the Program for Climate Model Diagnosis and Intercomparison (PCMDI), mostly during the years 2005 and 2006; this archived data constitutes phase 3 of the Coupled Model Intercomparison Project (CMIP3) and can be downloaded from http://climexp.knmi.nl/selectfield_co2.cgi?

The paper contains a large supplement file with pictures of all GCM runs and their comparison with the global surface temperature, for example that given by the Climatic Research Unit (HadCRUT3). I strongly invite readers to take a look at the numerous figures in the supplement file to get a feel for the real performance of these models in reconstructing the observed climate, which in my opinion is quite poor at all time scales.

In the figure below I just present the HadCRUT3 record against, for example, the average simulation of the GISS ModelE for the global surface temperature from 1880 to 2003 by using all forcings, which can be downloaded from http://data.giss.nasa.gov/modelE/transient/Rc_jt.1.11.html

figure 2

The comparison clearly emphasizes the strong discrepancy between the model simulation and the temperature data. Qualitatively similar discrepancies are found and are typical for all GCMs adopted by the IPCC.

In fact, although a certain warming trend is reproduced by the model, in apparent agreement with the observations, the simulation clearly fails to reproduce the cyclical dynamics of the climate, which presents an evident quasi-60-year cycle with peaks around 1880, 1940 and 2000. This pattern is further stressed by the synchronized 20-year temperature cycle.

The GISS ModelE also presents huge volcano spikes that are quite difficult to see in the temperature record. Indeed, in the supplement file I plot the GISS ModelE signature of the volcano forcing alone against the same signature obtained with two proposed empirical models that extract the volcano signature directly from the temperature data themselves.

figure 3

The figure clearly shows that the GISS ModelE computer model greatly overestimates the volcano cooling signature. The same is true for the other GCMs, as shown in the supplement file of the paper. This issue is quite important, as I will explain later. In fact, there exist attempts to reconstruct climate variations by stressing the climatic effect of volcanic aerosol, but the lack of strong volcano spikes in the temperature record suggests that the volcano effect is already overestimated.

In any case, the paper focuses on whether the GCMs adopted by the IPCC in 2007 reproduce the cyclical modulations observed in the temperature records. With a simple regression model based on the four cycles (with periods of about 9.1, 10, 20 and 60 years) plus an upward trend, which can be geometrically captured by a quadratic fit of the temperature, I have proved in the paper that all GCMs adopted by the IPCC fail to geometrically reproduce the detected temperature cycles at both the decadal and multidecadal scales.

figure 4

For example, the figure above depicts the regression model coefficients “a” (for the 60-year cycle) and “b” (for the 20-year cycle) as estimated for all IPCC GCM runs, which are simply numbered along the abscissa of the figure. Values of “a” and “b” close to 1 would indicate that a model simulation reproduces the corresponding temperature cycle well. As is evident in the figure (and in the tables reported in the paper), all models fail the test quite macroscopically.
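The regression test just described is easy to sketch. In the hypothetical Python example below, an “observed” series contains the 60-year and 20-year cycles while a GCM-like series carries only the trend; regressing each onto the harmonic basis yields coefficients near 1 and near 0 respectively. All amplitudes and numbers are illustrative stand-ins, not the actual data or model runs.

```python
import numpy as np

# Regress a series onto the 60-yr and 20-yr harmonics detected in the
# observations: a coefficient near 1 means the series carries that cycle,
# near 0 means it does not. Synthetic stand-ins throughout.
rng = np.random.default_rng(7)
t = np.arange(1850, 2011, 1 / 12)
cyc60 = 0.11 * np.cos(2 * np.pi * (t - 2000) / 60)   # peaks ~1880/1940/2000
cyc20 = 0.04 * np.cos(2 * np.pi * (t - 2000) / 20)
trend = 0.4e-4 * (t - 1850) ** 2                     # quadratic upward trend
obs = trend + cyc60 + cyc20 + 0.05 * rng.standard_normal(t.size)
gcm = trend + 0.05 * rng.standard_normal(t.size)     # trend but no cycles

def cycle_coeffs(series):
    """Coefficients a (60-yr) and b (20-yr) from a least-squares fit
    against the harmonic + quadratic regression basis."""
    X = np.column_stack([cyc60, cyc20, np.ones_like(t),
                         t - 1850, (t - 1850) ** 2])
    beta, *_ = np.linalg.lstsq(X, series, rcond=None)
    return beta[0], beta[1]

print("observation-like series:", cycle_coeffs(obs))  # both close to 1
print("GCM-like series:       ", cycle_coeffs(gcm))   # both close to 0
```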

The conclusion is evident, simple and straightforward: all GCMs adopted by the IPCC fail to correctly reproduce the decadal and multidecadal dynamical modulation observed in the global surface temperature record, and thus they do not reproduce the observed dynamics of the climate. Evidently, the “science is settled” claim is false. Indeed, the models are missing important physical mechanisms driving climate change, which may still be quite mysterious and which I believe to be ultimately astronomically induced, as better explained in my other papers.

But now, what can we do with this physical information?

It is important to realize that the “science is settled” claim is a necessary prerequisite for efficiently engineering any physical system with an analytical computer model, which is what the GCMs attempt to do for the climate system. If the science is not settled, such an engineering task is inefficient at best and theoretically impossible. For example, an engineer cannot build a functional electric device (a phone, a radio, a TV or a computer), or a bridge or an airplane, if some of the necessary physical mechanisms are unknown. Engineering does not usually work with a partial science. In medicine, for example, nobody claims to cure people by using some kind of physiological GCM! And GCM computer modelers are essentially climate computer engineers more than climate scientists.

In theoretical science, however, one can attempt to overcome the above problem by using a different kind of model, the empirical/phenomenological one, which has its own limits but also numerous advantages. One need only appropriately extract and use the information contained in the data themselves to model the observed dynamics.

Well, in the paper I used the geometrical information deduced from the temperature data to do two things:

  1. I propose a correction of the estimated net anthropogenic warming effect on the climate;
  2. and I implement this corrected net anthropogenic warming effect in the harmonic model to produce an approximate forecast for the 21st century global surface temperature, assuming the same IPCC emission projections.

To solve the first point we need to adopt a subtle reasoning. In fact, it is not possible to directly separate the natural from the anthropogenic component of the upward warming trend observed in the climate since 1850 (about 0.8 °C) by using the harmonic model calibrated on the same data, because with 161 years of data at most a 60-year cycle can be well detected, but not longer cycles.

Indeed, what numerous papers have shown, including some of mine, for example

http://www.sciencedirect.com/science/article/pii/S1364682609002089 , is that this 1850-2010 upward warming trend can be part of a multi-secular/millennial natural cycle, which was also responsible for the Roman warming period, the Dark Ages, the Medieval Warm Period and the Little Ice Age.

The following figure from Humlum et al. (2011), http://www.sciencedirect.com/science/article/pii/S0921818111001457 ,

figure 5

gives an idea of how these multi-secular/millennial natural cycles may appear, by attempting a reconstruction based on a multi-millennial proxy temperature record for central Greenland.

However, an accurate modeling of the multi-secular/millennial natural cycles is not currently possible. Their frequencies, amplitudes and phases are not known with great precision because the proxy models of the temperature look quite different from each other. Essentially, for our study we want to use only the real temperature data, and these data start in 1850, which is evidently too short a record for extracting multi-secular/millennial natural cycles.

To proceed I have adopted a strategy based on the 60-year cycle, which has been estimated to have an amplitude of about 0.3 °C, as the first figure above shows.

To understand the reasoning, a good starting point is the IPCC’s figures 9.5a and 9.5b, which are particularly popular among anthropogenic global warming (AGW) advocates: http://www.ipcc.ch/publications_and_data/ar4/wg1/en/figure-9-5.html

These two figures are reproduced below:

figure 6

Figure b above shows that, without anthropogenic forcing, according to the IPCC the climate should have cooled from 1970 to 2000 by about 0.0-0.2 °C because of volcanic activity. Only the addition of anthropogenic forcings (see figure a) could have produced the 0.5 °C warming observed from 1970 to 2000. Thus, from 1970 to 2000 anthropogenic forcings are claimed to have produced a warming of about 0.5-0.7 °C in 30 years. This warming is then extended in the IPCC GCMs’ projections for the 21st century with an anthropogenic warming trend of about 2.3 °C/century, as is evident in the IPCC’s figure SPM5 shown below.

figure 7

But our trust in this IPCC estimate of the anthropogenic warming effect is directly challenged by the failure of these GCMs to reproduce the 60-year natural modulation, which is responsible for at least about 0.3 °C of the warming from 1970 to 2000. Consequently, taking this natural variability into account, the net anthropogenic warming effect should not be above 0.2-0.4 °C from 1970 to 2000, instead of the 0.5-0.7 °C claimed by the IPCC.
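The subtraction behind this correction is simple enough to spell out explicitly, using the ranges quoted above:

```python
# The attribution arithmetic above: the IPCC-attributed 1970-2000
# anthropogenic warming minus the ~0.3 degC contributed by the rising
# phase of the 60-year cycle over the same window.
ipcc_low, ipcc_high = 0.5, 0.7   # degC attributed to man, 1970-2000
natural_60yr = 0.3               # degC from the 60-yr cycle, same window
net_low = ipcc_low - natural_60yr
net_high = ipcc_high - natural_60yr
print(f"corrected net anthropogenic warming, 1970-2000: "
      f"{net_low:.1f}-{net_high:.1f} degC")   # 0.2-0.4 degC
```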

This implies that the net anthropogenic warming effect must be reduced to a maximum within a range of 0.5-1.3 °C/century from 1970 to about 2050, assuming the same IPCC emission projections, as argued in the paper. This result also takes into account several possibilities, including the fact that the volcano cooling is evidently overestimated in the GCMs, as we have seen above, and that part of the leftover warming from 1970 to 2000 could still have been due to other factors such as the urban heat island effect and land use change.

At this point it is possible to attempt a full forecast of the climate since 2000, made of the four detected decadal and multidecadal cycles plus the corrected anthropogenic warming trend. The results are depicted in the figures below.

figure 8

The figure shows a full climate forecast from my proposed empirical model against the IPCC projections since 2000. It is evident that my proposed model agrees with the data much better than the IPCC projections do, as other tests presented in the paper also show.

My proposed model shows two curves: one calibrated on the period 1850-1950 and the other on the period 1950-2010. It is evident that the two curves reconstruct the climate variability from 1850 to 2011 at the decadal/multidecadal scales equally well, as the gray smoothed temperature curve highlights, with an average error of just 0.05 °C.
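The split-sample exercise can be sketched on synthetic data: calibrate a four-cycle harmonic model on the first half of the record only, then measure how well it reconstructs the withheld second half. The amplitudes and phases below are invented for illustration; only the four periods come from the text.

```python
import numpy as np

# Calibrate a 4-cycle harmonic model on 1850-1950 only, then test it on
# the withheld 1950-2010 span. Synthetic stand-in for the HadCRUT3 test.
rng = np.random.default_rng(3)
t = np.arange(1850, 2011, 1 / 12)
periods = [9.1, 10.5, 21.0, 61.0]
amps, phases = [0.04, 0.03, 0.03, 0.11], [0.3, 1.1, 2.0, 0.5]
signal = sum(a * np.cos(2 * np.pi * t / p + ph)
             for a, p, ph in zip(amps, periods, phases))
temp = signal + 0.05 * rng.standard_normal(t.size)   # noisy "temperature"

def basis(t):                      # intercept + cos/sin pair per period
    cols = [np.ones_like(t)]
    for p in periods:
        cols += [np.cos(2 * np.pi * t / p), np.sin(2 * np.pi * t / p)]
    return np.column_stack(cols)

cal = t < 1950                                   # calibration mask
beta, *_ = np.linalg.lstsq(basis(t[cal]), temp[cal], rcond=None)
recon = basis(t[~cal]) @ beta                    # 1950-2010 "forecast"
err = np.sqrt(np.mean((recon - signal[~cal]) ** 2))
print(f"RMS error on withheld 1950-2010 half: {err:.3f} degC")
```

This only demonstrates the geometrical logic of split-sample validation; whether the real climate keeps obeying the fitted constituents is exactly what the forecast test after 2000 probes.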

The proposed empirical model suggests that the same IPCC projected anthropogenic emissions would imply a global warming of about 0.3–1.2 °C by 2100, as opposed to the IPCC’s projected warming of 1.0–3.6 °C. My proposed estimate also excludes an additional possible cooling that may derive from the multi-secular/millennial cycle.

Some implicit consequences of this finding are that, for example, the ocean may rise considerably less than projected by the IPCC, let us say a third as much (about 5 inches/12.5 cm) by 2100, and that we probably do not need to destroy our economy in an attempt to reduce CO2 emissions.

Will my forecast curve work, hopefully, for at least a few decades? Well, my model is not an oracle’s crystal ball. As happens with the ocean tides, numerous other natural cycles may be present in the climate system at all time scales and may produce interesting interference patterns and complex dynamics. Other nonlinear factors may be present as well, and sudden events such as volcanic eruptions can always disrupt the dynamical pattern for a while. So, the model can surely be improved.

Perhaps the model I propose is just another illusion; we do not know yet for sure. What can be done is to continue and improve our research and add, month after month, new temperature dots to the graph to see how the proposed forecast performs, as depicted in the figure below:

figure 9

The figure above shows an updated version of the graph published in the paper, where the temperature record in red stops in Oct/2011. The figure adds the Nov/2011 temperature value in blue. The monthly temperature data are from http://www.cru.uea.ac.uk/cru/data/temperature/hadcrut3gl.txt

The empirical forecast curve (the black curve, made of the harmonic component plus the proposed corrected anthropogenic warming trend) looks in good agreement with the data up to now. OK, it is just one month, somebody may say, but the depicted forecasting model actually started in Jan/2000!

By comparison, the figure shows in yellow the harmonic component alone, made of the four cycles, which may be interpreted as a lower boundary for the natural variability based on the same four cycles.

figure 10

In conclusion, the empirical model proposed in the current paper is surely a simplified model that can probably be improved, but it already appears to greatly outperform all current GCMs adopted by the IPCC, such as the GISS ModelE. All of them fail to reconstruct the decadal and multidecadal cycles observed in the temperature records and have failed to properly forecast the steady global surface temperature observed since 2001.

It is evident that a climate model would be useful for any civil strategic purpose only if it is proven capable of predicting the climate evolution at least on a decadal/multidecadal scale. The traditional GCMs have so far failed this goal, as shown in the paper.

The attempts of some current climate modelers to explain and remedy the failure of their GCMs to properly forecast the approximately steady climate of the last 10 years are very unsatisfactory for any practical and theoretical purpose. In fact, some of the proposed solutions are: 1) a presumed underestimation of the cooling effect of small volcanic eruptions [Solomon et al., Science 2011] (while the GCM volcano effect is already evidently overestimated!); or 2) hypothetical Chinese aerosol emissions [Kaufmann et al., PNAS 2011] (which, however, were likely decreasing since 2005!); or 3) a 10-year “red noise” unpredictable fluctuation of the climate system driven by an ocean heat content fluctuation [Meehl et al., NCC 2011] (which, however, in the model simulations occurs in 2055 and 2075!).

Apparently, these GCMs can “forecast” climate change only a posteriori: that is, if we want to know what may happen from 2012 to 2020 with these GCMs, we first need to wait for 2020 and then adjust the GCM with ad hoc physical explanations, including even an appeal to an unpredictable “red-noise” fluctuation of the ocean heat content and flux system (occurring in the model in 2055 and 2075!), to attempt to explain the data during surface temperature hiatus periods that contradict the projected anthropogenic GHG warming!

Indeed, if this is the situation, it is really impossible to forecast climate change for at least a few decades, and the practical usefulness of this kind of GCM is quite limited and potentially very misleading, because the model can project a 10-year warming and then the “red-noise” dynamics of the climate system can completely change the projected pattern!

The fact is that the above ad hoc explanations appear to be in conflict with the dynamics of the climate system as evident since 1850. Indeed, these dynamics suggest a major multiple-harmonic component influencing the climate, with a likely astronomical origin (sun + moon + planets), although not yet fully understood in its physical mechanisms, that, as shown in the figures above, can apparently explain the post-2000 climate quite satisfactorily as well (even using my model calibrated on 1850-1950, that is, more than 50 years before the observed temperature hiatus period since 2000!).

Perhaps a new kind of climate model, based at least in part on empirical reconstructions built on empirically detected natural cycles, may indeed perform better, may have better predictive capabilities and, consequently, may prove more beneficial to society than the current GCMs adopted by the IPCC.

So, is a kind of Copernican Revolution needed in climate change research, as Alan Carlin has also suggested? http://www.carlineconomics.com/archives/1456

I personally believe there is an urgent need to invest more funding in scientific methodologies alternative to the traditional GCM approach and, in general, to invest more in pure climate science research rather than only in climate GCM engineering research, as has been done until now on the false claim that there is no need to invest in pure science because the “science is already settled.”

As for the other common AGW slogan, according to which the current mainstream AGW climate science cannot be challenged because it is based on the so-called “scientific consensus,” I would strongly suggest reading this post by Kevin Rice at the blog Catholibertarian, entitled “On the dangerous naivety of uncritical acceptance of the scientific consensus”:

http://catholibertarian.com/2011/12/30/on-the-dangerous-naivete-of-uncritical-acceptance-of-the-scientific-consensus/

It is a very educational and open-minded read, in my opinion.

Nicola Scafetta, “Testing an astronomically based decadal-scale empirical harmonic climate model versus the IPCC (2007) general circulation climate models” Journal of Atmospheric and Solar-Terrestrial Physics (2011).

http://www.sciencedirect.com/science/article/pii/S1364682611003385

http://scienceandpublicpolicy.org/reprint/astronomical_harmonics_testing.html

Abstract:

We compare the performance of a recently proposed empirical climate model based on astronomical harmonics against all CMIP3 available general circulation climate models (GCM) used by the IPCC (2007) to interpret the 20th century global surface temperature. The proposed astronomical empirical climate model assumes that the climate is resonating with, or synchronized to a set of natural harmonics that, in previous works (Scafetta, 2010b, 2011b), have been associated to the solar system planetary motion, which is mostly determined by Jupiter and Saturn. We show that the GCMs fail to reproduce the major decadal and multidecadal oscillations found in the global surface temperature record from 1850 to 2011. On the contrary, the proposed harmonic model (which herein uses cycles with 9.1, 10–10.5, 20–21, 60–62 year periods) is found to well reconstruct the observed climate oscillations from 1850 to 2011, and it is shown to be able to forecast the climate oscillations from 1950 to 2011 using the data covering the period 1850–1950, and vice versa. The 9.1-year cycle is shown to be likely related to a decadal Soli/Lunar tidal oscillation, while the 10–10.5, 20–21 and 60–62 year cycles are synchronous to solar and heliospheric planetary oscillations. We show that the IPCC GCM’s claim that all warming observed from 1970 to 2000 has been anthropogenically induced is erroneous because of the GCM failure in reconstructing the quasi 20-year and 60-year climatic cycles. Finally, we show how the presence of these large natural cycles can be used to correct the IPCC projected anthropogenic warming trend for the 21st century. By combining this corrected trend with the natural cycles, we show that the temperature may not significantly increase during the next 30 years mostly because of the negative phase of the 60-year cycle. 
If multisecular natural cycles (which according to some authors have significantly contributed to the observed 1700–2010 warming and may contribute to an additional natural cooling by 2100) are ignored, the same IPCC projected anthropogenic emissions would imply a global warming by about 0.3–1.2 °C by 2100, contrary to the IPCC 1.0–3.6 °C projected warming. The results of this paper reinforce previous claims that the relevant physical mechanisms that explain the detected climatic cycles are still missing in the current GCMs and that climate variations at the multidecadal scales are astronomically induced and, in first approximation, can be forecast.

January 10, 2012 7:12 am

Vukcevic,
Vukcevic,
you really sound like a broken record. You continuously refer to the CET record, ignoring all my explanations.
Did you take off the volcano signal first? Have you compared CET records with the other records we have? Have you adjusted the analysis by taking into account influences from other phenomena that could disrupt the record?
Do you want to understand that the CET record is a local record, referring to a square about 200 miles on a side, which is a microscopic percentage of the global surface?
Have you read my papers?
For example:
A. Mazzarella and N. Scafetta, “Evidences for a quasi 60-year North Atlantic Oscillation since 1700 and its meaning for global climate change,” Theor. Appl. Climatol., DOI 10.1007/s00704-011-0499-4 (2011).
N. Scafetta, “A shared frequency set between the historical mid-latitude aurora records and the global surface temperature” Journal of Atmospheric and Solar-Terrestrial Physics 74, 145-163 (2011). DOI: 10.1016/j.jastp.2011.10.013.
Did you look at the numerous references my paper contains? You were not even able to see the one reference I added above.
Ok, let us try again, let us talk about only one case for simplicity. Look here
http://www.vliz.be/imisdocs/publications/218039.pdf
look at figure 5.
Describe to me exactly what that figure shows, so we know that you took a look at it. Note that the figure is made of multiple panels. I am asking you to describe with precision what each of those panels shows.
If you give me the impression that you are looking at the figures, we can try to analyze another paper.

January 10, 2012 9:13 am

Dr. Scafetta
1. I think you may be missing an important point: for a good long-term temperature study you need a good long-term record, and the CET is the longest and the most reliable. Furthermore, it is next door to the AMO, which is the most influential factor, at least for the period of reliable instrumental global temperature records:
http://www.vukcevic.talktalk.net/GT-AMO.htm
Global temperature is just a bit more than a carbon copy of the North Atlantic SST (the AMO) and there are good reasons for that.
So my advice is get to know CET, get to know North Atlantic, the rest comes naturally.
2. Paleo “proxies” such as tree rings, ice cores, sediments, coral reefs etc. are only direction pointers and not reliable numerical references, hence I do not waste too much time on that.
3. You should find some consolation in a short note at the end of the web page:
http://www.vukcevic.talktalk.net/CET-SW.htm
stating: “There is a strong 22-year component, coinciding with the solar magnetic (Hale) cycle, which is not used in the reconstruction. It can be noted that the substantial drop in temperatures (2020-2050) predicted by the extrapolation is in good agreement with a similar drop in solar magnetic activity obtained by extrapolation of a totally different and unrelated equation”
http://www.vukcevic.talktalk.net/NFC7a.htm

I leave you to consider the numbers in the equation related to the Hale cycle graph of the sun’s polar magnetic field, which is an excellent precursor to the following SS cycle.
Analysing another paper? No thanks, I work to a different timetable.
It’s time for me to move on.
Good luck.

January 10, 2012 10:34 am

@Vukcevic, “I think you may be missing an important point: for good long term temperature study you need good long term record, and the CET is the longest and the most reliable. ”
Actually, that is not true. You may be interested in this work (among my references that you refuse to read)
Camuffo, D., et al., 2010.
500-year temperature reconstruction in the Mediterranean Basin by means of documentary data and instrumental observations.
Climatic Change 101, 169–199.
http://www.springerlink.com/content/q36j5713138wt30p
Here there is a claim of a quasi 60-year cycle in the temperature records since 1650.
You may also be interested in reading this (free) book
http://lyubushin.hotbox.ru/Climate_Changes_and_Fish_Productivity.pdf

January 10, 2012 12:30 pm

Theo Goodwin said January 9, 2012 at 3:32 pm

“Models cannot be used to predict anything. You, like the vast majority of climate scientists today, show that you are ignorant of the differences between models and theories. Theories can be used for prediction but models serve only to reproduce (reconstruct, in your terms) reality. Now, how can you predict anything from a reconstruction of reality?”
“Newton’s theory is a physical theory and its several hypotheses are well confirmed as it applies to our solar system. Of course it does not provide the detail or reach of Einstein’s theory but it does just fine in our solar system.”
“You have failed to grasp the difference between physical hypotheses and models. Physical hypotheses describe natural regularities. The key word here is ‘describe.’ Physical hypotheses are about some aspect of physical reality, and the true or really well confirmed physical hypotheses (which make up mature theories, eventually) tell us what that reality is.”

Models are used to predict things all the time. Planetariums are models of the solar system, for example, and are used to predict the future positions of the planets relative to each other.
Newton had no Theory of Gravitation and categorically refused to provide one. Newton showed that there was a mathematical equation linking the motions of the planets and every other body with mass. This is Newton’s Law of Gravity and is used to construct planetariums. Neither of the two competing theories of gravitation (quantum/relativity) have any relevance for building planetariums (models).
A hypothesis is a provisional theory. It must accord with known facts, and serves as a starting-point for further investigation by which a theory may be arrived at. A theory is a system of ideas or statements held as an explanation of a group of facts or phenomena. A Hypothesis that has been corroborated by observation or experiment, and is accepted as accounting for the known facts is a Theory. In current philosophy of science the latter distinction has little importance. While theories can be falsified, few believe they can be proved. Using Popper’s terminology, hypotheses and theories are both conjectures.
Nicola, your modelling looks interesting. The proof of the pudding will be in the eating 🙂

January 10, 2012 12:44 pm

Nicola Scafetta said January 10, 2012 at 10:34 am

“You may also be interested in reading this (free) book
http://lyubushin.hotbox.ru/Climate_Changes_and_Fish_Productivity.pdf

Nicola, I get a 404 on this.

January 10, 2012 1:38 pm

It goes to show that chaos is not the observed disorder of things, it is the lack (disorder) of knowledge about the observed order of things. Facts do not confer wisdom, but it is wisdom that discovers the context of the facts.
Doctor Scafetta, there is also a millennial scale that needs to be added to your harmonics: that of Earth’s obliquity. Observing the graphs for the ice ages and obliquity, one thing stands out: the average length of an ice age event is 100k years, and that average is composed of 80k and 120k events. While the Earth does not exit an ice age every 41k years when the obliquity cycle reaches its maximum angle of >24 degrees, over the 10 recorded ice age events, EVERY TIME WITHOUT FAIL, as obliquity drops below 23.5 degrees an ice age STARTS. We are now below 23.5 degrees of obliquity. Superimpose figure 5 with the Earth’s obliquity and then recalculate using this trend in your figure 9 for more accurate results.

January 10, 2012 3:07 pm

Theo Goodwin says:
January 9, 2012 at 1:19 pm
“”What does a circle describe? A point? A sine curve? None of them describe anything. To claim that they can describe the “dynamical evolution of the climate system” is totally without meaning.””
For a look at how well the solar/lunar cycle repeats at the ~18-year period, click the link below and scroll down to the forecast maps for 31 January 2012, where the composite forecast map can be compared with the four maps of temperatures from past periods to check for similarities between cycles over time.
http://tallbloke.wordpress.com/aerology-forecast-verfication/

January 10, 2012 4:08 pm

Thank you very much Dr. Scafetta,
Very interesting article. The way I read it, your model does not foresee the global cooling from 2001 continuing for much longer or becoming much deeper.
I hope you are right!

January 10, 2012 6:51 pm

thepompousgit says: “….”
Do not click on the link (for some reason it does not work):
http://lyubushin.hotbox.ru/Climate_Changes_and_Fish_Productivity.pdf
You need to copy and paste the link into your browser.
dscott says:
“It goes to show that chaos is not the observed disorder of things, it is the lack (disorder) of knowledge about the observed order of things.”
that is an interesting sentence 🙂

sky
January 10, 2012 7:18 pm

The linear trend plus sinusoids plus noise model is so simplistic that the success of any prediction based on it will be more a matter of luck than of scientific skill. Aside from the tides, geophysical variables are almost never strictly periodic; chaotic randomness is the general rule. What makes reliance on harmonic components highly precarious here is that even Scafetta’s analyses do not show the sharp spectral peaks characteristic of narrow-band processes, which might be usefully approximated over short prediction horizons by pure sinusoids. The situation calls for proper predictive filters rather than curve fitting.
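For readers unfamiliar with the kind of model sky is criticizing, here is a minimal sketch of a "linear trend plus sinusoid" least-squares fit on synthetic data. The periods, amplitudes, and noise level below are invented for illustration; this is not Scafetta's actual code or data.

```python
import numpy as np

# Synthetic "temperature" record: linear trend + one 60-year cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(1850, 2001, dtype=float)
y = (0.005 * (t - 1850)
     + 0.1 * np.cos(2 * np.pi * (t - 2000) / 60.0)
     + 0.05 * rng.standard_normal(t.size))

# Least-squares fit of the same functional form: a + b*t + c*cos + d*sin.
X = np.column_stack([
    np.ones_like(t),
    t - 1850,
    np.cos(2 * np.pi * t / 60.0),
    np.sin(2 * np.pi * t / 60.0),
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
amplitude = np.hypot(coef[2], coef[3])  # recovered 60-year amplitude
print(round(coef[1], 4), round(amplitude, 3))
```

When the assumed form matches the data-generating process, the fit recovers trend and amplitude well; sky's point is that nothing guarantees the real climate follows such a form outside the fitting window.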

January 10, 2012 7:35 pm

Nicola Scafetta said January 10, 2012 at 6:51 pm

do not click on the link (for some reason it does not work)
http://lyubushin.hotbox.ru/Climate_Changes_and_Fish_Productivity.pdf
you need to copy and paste the link into the browser.

Thanks 🙂

dscott says:
“It goes to show that chaos is not the observed disorder of things, it is the lack (disorder) of knowledge about the observed order of things.”
that is an interesting sentence 🙂

Shannon — Information Theory. Very interesting…

Paul_K
January 10, 2012 9:27 pm

Dr Scafetta,
I recently posted a highly relevant article on Lucia’s site here:
http://rankexploits.com/musings/2011/noisy-blue-ocean-blue-suede-shoes-and-agw-attribution/
It considers the inversion of the temperature series (Hadcrut3 used in the article) to the input flux forcings, i.e. the flux required to give an exact reproduction of the temperature series, using in this instance the same assumed sensitivity as the GISS-ER model.
Decomposition of the resulting flux data reveals the very-low-frequency component to be a smooth convex curve over the entire instrument series. This is not compatible with your assumption of a quadratic fit for the very-low-frequency component (periodicity > 60 years) of the historical temperature series, since a quadratic in temperature translates into a straight line in the flux data. Nor is it compatible with your linear extrapolation of the very-low-frequency component, since a linear trend in temperature translates into a constant value in the flux data.
In fact, it is readily shown that the statistics of temperature fit obtained using the abstracted flux curve are superior to your assumption of a quadratic in temperature/straight line in flux.
I really do urge you to read the article. It should not only improve your historical fit, but also potentially offers firmer ground on which to base your prognosis.
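Paul_K's mapping (quadratic temperature implies linear flux; linear temperature implies constant flux) holds if the inverted flux is dominated by a heat-uptake term proportional to dT/dt. That is an assumption of the sketch below, not necessarily the exact inversion used in the linked article; the heat capacity and trend coefficient are illustrative.

```python
import numpy as np

# Hypothetical zero-dimensional inversion: assume the very-low-frequency
# flux is dominated by heat uptake, F(t) ~ C * dT/dt (an illustrative
# assumption, not necessarily Paul_K's exact method).
C = 5.0  # effective heat capacity, arbitrary units
t = np.arange(1850, 2001, dtype=float)

T_quadratic = 0.000049 * (t - 1850) ** 2  # quadratic temperature trend
F = C * np.gradient(T_quadratic, t)       # implied flux

# The implied flux is (numerically) a straight line: its second
# differences vanish away from the endpoints, where np.gradient
# falls back to one-sided differences.
second_diff = np.diff(F, 2)
print(np.abs(second_diff[1:-1]).max())
```

By the same argument, a linear T(t) would give a constant F(t), which is Paul_K's objection to the linear extrapolation beyond 2000.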

January 11, 2012 4:55 am

Paul_K says:
January 10, 2012 at 9:27 pm
Paul, the quadratic curve is just a geometrical way to capture the temperature trend from 1850 to 2000. There is nothing wrong in using it; it is just a first- plus second-order approximation of the trend. As explained many times in the paper, the quadratic trend is not part of the physical model. As for the linear trend, that is what the IPCC does, so I use the same approximation.
You need to read my paper carefully to understand the reasoning.
If other observables are used instead of the temperature, the geometry may appear different, of course. It is always good to start with simple models; then it is possible to search for improvements of the models.

Pterostyrax
January 11, 2012 2:06 pm

Interesting article. One question: why do the initial (1850) temperature anomalies differ so significantly from your initial temperature anomalies in the lower part of figure 1?

January 11, 2012 3:11 pm

@Pterostyrax says:
The quality of the temperature data decreases as we go back in time, so the further back you go, the noisier the record becomes.
This can be seen more easily in figure 4 of this publication of mine:
http://www.fel.duke.edu/~scafetta/pdf/PRE26303.pdf
Another commentary has been published here, which people may be interested to read:
http://www.forbes.com/sites/larrybell/2012/01/10/global-warming-no-natural-predictable-climate-change/

January 11, 2012 4:18 pm

Dr. Scafetta,
I have published an article on my page “Climate Change (“Global Warming”?) – The cyclic nature of Earth’s climate” quoting the abstract of your paper “Testing an Astronomically Based Decadal-Scale Empirical Harmonic Climate Model vs. the IPCC (2007) General Circulation Climate Models” at http://www.oarval.org/ClimateChange.htm (Spanish at http://www.oarval.org/CambioClima.htm).
I hope you approve of it. If you have any comments, please let me know.

January 11, 2012 7:21 pm

Andres, thank you
nice web-page with a lot of information!

Mark T
January 11, 2012 7:41 pm

Re: quadratic interpolation… one of the most interesting adaptive equalization methods I have seen is based on negentropy, approximated by squaring the kurtosis (fourth order), an eighth-order equation. Interestingly, under proper conditions, the result collapses to a second-order equation known as a “minimum output energy” solution. In general, the mechanism is referred to as independent component analysis, a stricter cousin of principal component analysis, i.e. the same PCA that Mann the physics dropout made infamous.
Mark
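As a toy illustration of the contrast measure Mark T mentions (excess kurtosis as a proxy for non-Gaussianity, as used in some ICA algorithms), the sketch below uses invented test signals to show that a Gaussian sample has near-zero excess kurtosis while a super-Gaussian (Laplace) sample does not.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

rng = np.random.default_rng(42)
n = 200_000
gauss = rng.standard_normal(n)   # excess kurtosis near 0
laplace = rng.laplace(size=n)    # excess kurtosis near 3 (super-Gaussian)

# ICA-style contrasts maximize a non-Gaussianity measure such as this
# (or its square, as a rough negentropy proxy, per the comment above).
print(round(excess_kurtosis(gauss), 2), round(excess_kurtosis(laplace), 2))
```

This is only the contrast function; a full ICA additionally rotates mixed signals to maximize it per component.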

tallbloke
January 12, 2012 2:37 pm

For those interested, there will be a post on my blog soon which deals with part of the topic.
I understand why Anthony doesn’t want the subject discussed here, no matter how well established it is in the scientific literature. To quote Ivanka Charvatova:
“I represent our institute in the Czech National Climate Programme. These people “research” only greenhouse effect vs temperatures. I call them “heaters”. Sometimes I feel like a lone Hussite warrior – myself against all. They deny the existence of solar influence on climate let alone the influence of the whole solar system. Most of them refuse to talk to me, most of them even do not say hello, when we meet. Even now when many world journals publish articles about the influence of the Sun on climate. Probably this requires more time. Many discoveries had to wait, some very long. I do not waste my time fighting windmills. God will sort it out when the right time comes.”
http://tallbloke.wordpress.com/2011/06/10/interview-with-ivanka-charvatova-is-climate-change-caused-by-solar-inertial-motion/

Joachim Seifert
January 14, 2012 12:05 pm

Dr. Scafetta, good work, good article abstract with resulting conclusions…..
It explains well the stepwise shape of GMT: planetary (Jupiter and Saturn) oscillations as a
physical three-body problem, with Jup+Sat as the third-body gravitational force pulling/pushing the
real Earth’s trajectory further from/closer to the Sun (less/more RF)……
……as you point out: “The actual physical mechanism…..of cycles is still obscure…”
But, analyzing the planetary oscillation (now it is time for the planet Earth), or libration
(see Wikipedia: animated picture for the Moon) [other terms: perturbation, ligation, osculation],
we will detect further RF (radiative forcing)…..and get to the bottom of this “Copernican approach…”
I hope I will have my paper on the subject ready by May/June and will then send you an author’s
copy…..
JS

Johnnythelowery
January 16, 2012 5:39 pm

Nicola: I don’t know if you are still there or if you’ll see this, but I was wondering what your first thoughts are, off the top of your head, regarding this. I was impressed with Brian Cox’s ‘Wonders of the Solar System’. What stuck in my head was his visit to the Iguazu (sp?) River. He stated that the river (level or flow rate??) tracks sunspots, and that there was no known connection. I discussed it here on WUWT and asked Leif why this was not interesting. He said correlation is not causation. While the Iguazu does track sunspots, many or nearly all other rivers, including the Amazon, don’t. Made sense to me…but why does the Iguazu track sunspots? I puzzled over it and dropped it, but often wondered about it. It would have to be an anomaly that only affected that particular feeder area of the Iguazu. What on earth could it be? So I’m sitting, minding my own business, watching the YouTube clip below. It seems that the Hubble was having trouble with a couple of instruments…but only over Brazil/Argentina (in the neck of the woods of the Iguazu). NASA gave this patch which affected their instruments a name: the South Atlantic Anomaly. To prevent the trouble, NASA switched off the instruments while Hubble flew through the South Atlantic Anomaly. So perhaps the phenomenon driving the Iguazu has as much to do with the Sun as with the local oddities in Earth’s magnetic field. Here is the clip; the part relevant to the above starts at 00:43. Thx…

Jose_X
February 9, 2012 7:43 pm

I would like to read this paper carefully, but the impression I get, if I understand the bits I read, is that the author is just finding patterns of an oscillatory nature and then suggesting those patterns will hold and can be used to predict the future. The question here, specifically for the 60-year cycle case, is how many previous 60-year cycles he has analyzed, since we have had at most 3 such cycles of reasonably decent temperature measurements. [I am only going back at most to the mid-1800s, since such a cycle is obviously not very large in amplitude and would disappear into the error bars of the proxy reconstructions that go further back.] This study is interesting as a way to help zoom in on possible physics we might not yet be clearly aware of, but until you identify the physics, you won’t get far in any scientific community. .. Anyway, I haven’t read the posting or the paper yet.

Jose_X
February 9, 2012 8:15 pm

…Another concern I have (prior to reading the material beyond a piece of the introduction, and judging by a random comment I read) is that the IPCC models are perhaps being attacked for not matching at every pixel, yet that is not necessary if you stay within error bounds. I am not saying the current climate computer models can stand no improvement. I’m saying that I can come up with “an infinite” number of curves (e.g., by hand, or using a computer if I want to get fancy) that come closer than any given model yet be absolutely incorrect about the future. I can draw a line right through the average of the current temps and then, after 2012, do 5 loop-the-loops and end with a dive towards 0 K before I even get to 2100. The possibilities are limitless! As for predictions based on cycle analysis, we can use the stock market as an example. Since when does cycle analysis predict the stock market? It doesn’t. [Of course, man has a much greater say over market prices than we do over earth surface temps.] Remember that we didn’t get to the moon or create modern technology via models that blindly extend historical cyclic patterns rather than capture the actual physics going on. We progress because of improved understanding of the physics. .. Anyway, despite what I just said, I can imagine how a paper on this topic might make a good contribution to climate science.

Jose_X
February 19, 2012 8:33 pm

I have a lot of the paper to read, but I have skimmed enough to be critical of their methods and conclusions.
The predictably rising CO2 (greenhouse effect) contribution to temperature does not show up in a pure cyclical (harmonic) analysis. This paper does more than harmonic analysis, but the coefficients derived from the harmonic parts can get thrown off to the extent that the superimposed trends don’t fit the data well. I haven’t done the math yet, but it’s possible this skewing effect cancels out some or all of what the authors believe the IPCC incorrectly included in their projections. Instead, this analysis can easily end up attributing to a cyclical pattern what is more appropriately part of a growing non-cyclical trend that might in fact have been properly identified by the IPCC.
The full analysis in this paper also includes two “trend” components that are superimposed on the cycles, a linear and a quadratic trend. Without at least these functions, their analysis would not work since, as covered extensively in the literature, the temperatures from 1850 to 2000 have had an upward, non-cyclical trend due to the CO2 greenhouse effect. Essentially, the authors used a quadratic function to simulate the upward trend from 1850 to 2000, and then, conveniently for their thesis, assume that from 2000 onward the trend no longer has a quadratically increasing component but divine that it will simply have a linear component.
In conclusion (for now), the authors recognized you needed a function that grows faster than linear in order to properly capture the trend from 1850 – 2000, but then decide the future temperatures of the planet will no longer have such a quadratic trend and instead merely have a linear trend. Why the authors change gears like this is a detail I hope to discover as I read the paper. If the authors don’t provide a solid physical explanation for why we should no longer expect to see the observed quadratic trend we have been seeing since 1850 and instead expect no more than a linear trend after 2000, the paper would appear to fall flat on its face (at least as concerns its conclusions that climate sensitivity should be about 1/3 what the IPCC predicts). For those who want to explore this, check out in particular the definition of q(t) as defined on page 10, eqns 9 and 10.
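Jose_X's concern, that a mis-specified trend can leak into the fitted harmonic coefficients, is easy to demonstrate on synthetic data. The sketch below (coefficients invented for illustration, not taken from the paper) fits a 60-year harmonic to data containing a quadratic trend, with and without a quadratic term in the regression.

```python
import numpy as np

# Toy data: quadratic (CO2-like) trend plus a 60-year cycle of
# amplitude 0.1. Coefficients are illustrative only.
t = np.arange(1850, 2001, dtype=float)
y = 0.000049 * (t - 1850) ** 2 + 0.1 * np.cos(2 * np.pi * t / 60.0)

cos60 = np.cos(2 * np.pi * t / 60.0)
sin60 = np.sin(2 * np.pi * t / 60.0)

def fitted_amplitude(columns):
    """Least-squares fit; returns the amplitude of the cos/sin pair,
    assumed to be the last two columns."""
    X = np.column_stack(columns)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.hypot(coef[-2], coef[-1])

ones = np.ones_like(t)
a_no_trend = fitted_amplitude([ones, cos60, sin60])
a_with_trend = fitted_amplitude(
    [ones, t - 1850, (t - 1850) ** 2, cos60, sin60])
print(round(a_no_trend, 3), round(a_with_trend, 3))
```

With the quadratic term included, the true amplitude 0.1 is recovered; without it, part of the trend is absorbed into the harmonic coefficients and the estimated amplitude is biased.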

Joachim Seifert
Reply to  Jose_X
February 20, 2012 8:55 am

The UPWARD linear warming trend (1850-2000) cannot be explained with Scafetta, only the
cyclic shape of the upward trend…..
The upward trend has to be added to Scafetta; see the literature on the German Amazon.de,
ISBN 978-3-86805-604-4, which starts at the bottom of the LIA (1650) and culminates
by year 2000 in the temp plateau of the 21st century…… BUT this NOT linearly, rather in cycle steps
(Scafetta’s achievement), which have an astronomical cause (3-body gravitation, taking Saturn/Jupiter into account)….
In order to assess the full message/causes of global warming/climate change, you need
both literatures together….. they complement each other…
JS

Jose_X
February 20, 2012 11:04 am

Joachim Seifert>> THe UPWARD going linear warming trend (1850-2000)
In this paper, in contrast to a paper he co-authored published earlier last year (which I recently glanced at after downloading from a different website), Scafetta did not use a linear trend for the 1850-2000 period but instead a quadratic trend. See eqn 10 on page 10 and eqn 4 on page 6 [the formal paper starts on page 19 of the pdf linked in the top blog posting].
Let me add:
Scafetta then switched to a linear trend for the 2000-2100 prediction period. This is shown in eqn 10 and eqn 9. [Note that p(2000) is a constant value not a quadratic function.]
Had the paper stayed with the same exact quadratic trend that fit the 1850-2000 period, the temperature value at 2100 would have been (.000049*(250)^2-.0035*250-.3)-(.000049*(150)^2-.0035*150-.3) = (1.89)-(.28) = 1.61 C higher. According to his chart, Fig 5 on page 11, this would put his 2100 projection mean, not at about 1.15 but at about 2.75.
I am not sure how this 2100 projection compares with 2xCO2 and climate sensitivity (which is defined as an ideal value representing 2xCO2 equilibrium.. which would be a number higher than the temp on the date 2xCO2 is first achieved), but I think those figures are in the ballpark.. putting climate sensitivity not too far from 3.
I think Scafetta’s reasoning for switching over abruptly to a linear trend from 2000 onward is: “Thus, the above estimated 1.30 C/century anthropogenic warming trending is likely an upper limit estimate. As a lower limit we can reasonably assume the 0.66 +/- 16 C/century, as estimated in Loehle and Scafetta (2011).”
If Scafetta is right that the warming trend will be linear in growth as he estimated from 2000 to 2100, then the 2100 temp projection is probably a decent one, at least if the assumption stated at the beginning of the paper holds, that the climate goes through natural cycles of fixed durations and amplitudes [note the constant C_1 and C_2 values of eqn 3, constant C_3 and C_4 values of eqn 7, and the constant periods of all of those cosines]. However, if the trends continue roughly as he modeled from the 1850-2000 period, then his projections for 2100 were about 1.6 C on the low side.
I think this sort of analysis is interesting and can provide insight and probably a first guess estimate, but it is not based on physics.
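The arithmetic in the comment above is easy to check with a few lines of Python, using the quadratic coefficients exactly as quoted in the comment (whether they match eqn 10 of the paper is for the reader to verify).

```python
# Quadratic trend as quoted in the comment:
# p(t) = 0.000049*t**2 - 0.0035*t - 0.3, with t in years since 1850,
# so t = 150 corresponds to year 2000 and t = 250 to year 2100.
def p(t):
    return 0.000049 * t ** 2 - 0.0035 * t - 0.3

gap = p(250) - p(150)  # extra warming 2000 -> 2100 if the quadratic held
print(round(p(250), 2), round(p(150), 2), round(gap, 2))
```

This reproduces the quoted values of 1.89, 0.28, and a 1.61 C gap between continuing the quadratic and freezing it at its year-2000 value.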

Joachim Seifert
Reply to  Jose_X
February 20, 2012 12:49 pm

Jose X: Scafetta’s paper explains very well the influence of the astronomical 3-body-gravitation
cycles, which result in a STEPWISE (flat-steep-flat-steep) form of the climate (60-20-40 year)
cycles, also given in a recent paper of Akasofu, S.-I., “On the recovery from the Little Ice Age”,
Natural Science 2, p. 1211-1224 …..where the Earth’s recovery from the LIA proceeded at a
roughly 0.5 C/century recovery rate in a LINEAR manner…… for the reason and the calculation of this recovery, see my booklet, as pointed out before…..
Scafetta’s analysis does the following, to quote Akasofu: it focuses on the modulation of multidecadal
oscillations of 50-60 years superimposed on the recovery trend….and from this we can see the
cause of the halting of warming after 2000….
In short: Scafetta analyzes only the superimposed oscillations……
These occur on top of the linear recovery trend, which he just takes as a fact, without
causal analysis, from the 3 to 4 cycles since 1850……
Therefore, his superimposed cycles and the recovery trend from me, as a joint analysis, would
provide you the whole picture…..You cannot blame Scafetta for not having integrated the full
recovery analysis……but do not worry, there are more months ahead and you will get it later
in the year…..
JS