Predictions Of Global Mean Temperatures & IPCC Projections

Guest post by Girma Orssengo, B. Tech, MASc, PhD

The Intergovernmental Panel on Climate Change (IPCC) claims that human emission of CO2 causes catastrophic global warming. When such an extraordinary claim is made, everyone with a background in science should examine the data and verify whether the claim is justified. In this article, a mathematical model is developed that agrees with the observed Global Mean Temperature Anomaly (GMTA); its prediction shows global cooling of about 0.42 deg C between now and 2030. Also, a comparison of the observed increase in human emission of CO2 with the increase in GMTA during the 20th century shows no relationship between the two. As a result, the IPCC's claim of climate catastrophe is not supported by the data.

Fossil fuels allowed man to live his life as a proud human, but the IPCC asserts its use causes catastrophic global warming. Fortunately, the IPCC's global warming claim that “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios” [1] is not supported by observations, as shown in Figure 1, which shows a plateau in the global mean temperature trend over the last decade.

Figure 1. Observed global mean temperatures compared with IPCC projections.

Figure 1 also shows that the observed temperatures are even lower than the IPCC projections for emissions held constant at the 2000 level.

As a result, the statement we often hear from authorities like UN Secretary-General Ban Ki-moon that “climate change is accelerating at a much faster pace than was previously thought by scientists” [3] is incorrect.

Thanks to the release of private emails of climate scientists, we can now learn from their own words whether global warming “is accelerating at a much faster pace” or not. In an email dated 3-Jan-2009, Mike MacCracken wrote to Phil Jones and Chris Folland [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong. Otherwise, the Skeptics will be all over us–the world is really cooling, the models are no good, etc. And all this just as the US is about ready to get serious on the issue.

We all, and you all in particular, need to be prepared.

Similarly, in an email dated 24-Oct-2008, Mick Kelly wrote to Phil Jones [5]:

Just updated my global temperature trend graphic for a public talk and noted that the level has really been quite stable since 2000 or so and 2008 doesn’t look too hot.

Be awkward if we went through a early 1940s type swing!

The above statements from the Climategate emails show conclusively that the phrase used widely by authorities in public, that global warming “is accelerating at a much faster pace”, is supported neither by climate scientists in private nor by the observed data.

Thanks also go to the Climatic Research Unit (CRU) and the Hadley Centre for daring to publish global mean temperature data that is “quite stable since 2000”, contrary to the IPCC projection of 0.2 deg C of warming per decade. Had they not done so, we would have been forced to swallow the extremely irrational concept that the gas CO2, a plant food, i.e. foundation of life, is a pollutant because it causes catastrophic global warming.

Since, as the email above puts it, the “models are no good”, the objective of this article is to develop a valid mathematical global mean temperature model based on observed temperature patterns.

Mathematical Model For The Global Mean Temperature Anomaly (GMTA) Based On Observed Temperature Patterns

The Global Mean Temperature Anomaly (GMTA) data from the Climatic Research Unit (CRU) and the Hadley Centre (HadCRUT3), shown in Figure 2, will be used to develop the mathematical model. In this article, the observed GMTA data are assumed to be valid.

Examination of Figure 2 shows that the globe has been warming at a linear rate, as shown by the central least-squares trend line given by the equation

Linear anomaly in deg C = 0.0059*(Year-1880) – 0.52 Equation 1

Figure 2 also shows that superimposed on this linear anomaly line there is an oscillating anomaly that gives the Global Mean Temperature Anomaly (GMTA) the characteristics summarized in Table 1.

Table 1. Characteristics of the observed Global Mean Temperature Anomaly (GMTA) shown in Figure 2.

From the 1880s to the 1910s: end of warming, plateau at –0.2 deg C, then cooling trend
From the 1910s to the 1940s: end of cooling, plateau at –0.6 deg C, then warming trend
From the 1940s to the 1970s: end of warming, plateau at 0.1 deg C, then cooling trend
From the 1970s to the 2000s: end of cooling, plateau at –0.3 deg C, then warming trend
From the 2000s to the 2030s: end of warming, plateau at 0.5 deg C, then ? trend

A mathematical model can be developed that satisfies the requirements listed in Table 1. If the model gives a good approximation of the GMTA values at its turning points (plateaus) and of the GMTA trends between successive turning points, as summarized in Table 1, it may be used for prediction.

Figure 2. Observed Global Mean Temperature Anomaly (GMTA) from the Climatic Research Unit and the Hadley Centre, with the central linear trend line and the two parallel envelope lines.

For the oscillating anomaly, a cosine function meets the requirements listed in Table 1. From Figure 2, the amplitude of the oscillating anomaly is the vertical distance in deg C from the central linear anomaly line to either the top or the bottom parallel line, and it is about 0.3 deg C. From Figure 2, the oscillating anomaly was at its maximum in the 1880s, 1940s & 2000s, and at its minimum in the 1910s and 1970s. The number of years between successive maxima (or minima) of the oscillating anomaly is the period of the cosine function, about 1940–1880 = 1970–1910 = 60 years. With the amplitude of 0.3 deg C and the period of 60 years determined, the equation for the oscillating anomaly, for years starting from 1880, can be written as

Oscillating anomaly in deg C = 0.3*Cos(((Year-1880)/60)*2*3.1416) Equation 2

In the above equation, the factor 2*3.1416 (approximately 2π) converts the argument of the cosine function to radians, as required for computation in Microsoft Excel. If the angle is required in degrees, replace 2*3.1416 with 360.

Combining the linear anomaly given by Equation 1 and the oscillating anomaly given by Equation 2 gives the equation for the Global Mean Temperature Anomaly (GMTA) in deg C for the years since 1880 as

GMTA = 0.0059*(Year-1880) – 0.52 + 0.3*Cos(((Year-1880)/60)*2*3.1416) Equation 3
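Equation 3 can be evaluated in a spreadsheet or in a few lines of code. The following is a minimal sketch (Python is used here purely for illustration; the article's own calculations were done in Excel) that evaluates Equation 3 at the turning-point years discussed below:

```python
import math

def gmta_model(year):
    """GMTA model of Equation 3: linear trend (Equation 1) plus a
    60-year cosine oscillation (Equation 2), in deg C."""
    linear = 0.0059 * (year - 1880) - 0.52
    oscillating = 0.3 * math.cos((year - 1880) / 60.0 * 2 * math.pi)
    return linear + oscillating

# Model GMTA at the turning-point years: the first five correspond to Table 2
# (agreement to within about 0.01 deg C of rounding); 2030 is the next
# predicted turning point.
for year in (1880, 1910, 1940, 1970, 2000, 2030):
    print(year, round(gmta_model(year), 2))
```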

The validity of this model may be verified by comparing its estimate with observed values at the GMTA turning points as summarized in Table 2.

Table 2. Comparison of the model with observations for GMTA in deg C at its turning points.

Period: Observed (Table 1) / Model (Equation 3)
Warming plateau for the 1880s: -0.2 / -0.22
Cooling plateau for the 1910s: -0.6 / -0.64
Warming plateau for the 1940s: +0.1 / +0.13
Cooling plateau for the 1970s: -0.3 / -0.29
Warming plateau for the 2000s: +0.5 / +0.48

Table 2 shows excellent agreement between the observed and model GMTA values at all observed GMTA turning points.

A graph of the GMTA model given by Equation 3 is shown in Figure 3, which includes the observed GMTA and short-term IPCC projections for GMTA from 2000 to 2025. In addition to the verification shown in Table 2, Figure 3 shows good agreement for the GMTA trends throughout observed temperature records, so the model may be used for prediction. As a result, Figure 3 includes GMTA predictions until 2100, where the year and the corresponding GMTA values are given in parentheses for all the GMTA turning points.

As shown in Figure 3, slight discrepancies exist between the observed and model GMTA values: at the end of the 1890s the observed values were noticeably warmer than the model pattern, and in the 1950s they were noticeably colder.

Figure 3. Comparison of observed Global Yearly Mean Temperature Anomaly (GMTA) with models.

From the model in Figure 3, during the observed temperature record, there were two global warming phases. The first was from 1910 to 1940 with a warming of 0.13+0.64=0.77 deg C in 30 years. The second was from 1970 to 2000 with a warming of 0.48+0.29=0.77 deg C in 30 years. Note that both warming phases have an identical increase in GMTA of 0.77 deg C in 30 years, which gives an average warming rate of (0.77/30)*10=0.26 deg C per decade.

From the model in Figure 3, during the observed temperature record, there were two global cooling phases. The first was from 1880 to 1910 with a cooling of 0.64-0.22=0.42 deg C in 30 years. The second was from 1940 to 1970 with a cooling of 0.13+0.29=0.42 deg C in 30 years. Note that both cooling phases have an identical decrease in GMTA of 0.42 deg C in 30 years, which gives an average cooling rate of (0.42/30)*10=0.14 deg C per decade.

The above results for the normal ranges of GMTA determined from the model can also be calculated using simple geometry in Figure 2. In this figure, almost all observed GMTA values are enveloped by the two parallel lines that are 0.6 deg C apart. Therefore, as a first approximation, the normal range of GMTA is 0.6 deg C. From Figure 2, the period for a global warming or cooling phase is about 30 years. Therefore, as a first approximation, the normal rate of global warming or cooling is (0.6/30)*10=0.2 deg C per decade.

The above approximation of 0.6 deg C for the normal range of GMTA should be refined by including the effect of the linear warming anomaly given by Equation 1, about 0.006 deg C per year, which is the slope of the two parallel envelope lines in Figure 2. While the oscillating anomaly changes by 0.6 deg C in 30 years between its turning points, the linear anomaly increases by 0.006*30=0.18 deg C. Due to this persistent warming, instead of the GMTA increasing or decreasing by the same 0.6 deg C, it increases by about 0.6+0.18=0.78 deg C during its warming phase and decreases by about 0.6–0.18=0.42 deg C during its cooling phase. As a result, the refined normal ranges of GMTA are about 0.78 deg C in 30 years during its warming phase and 0.42 deg C in 30 years during its cooling phase. These results obtained using simple geometry in Figure 2 agree with those obtained from the model in Figure 3 (0.77 and 0.42 deg C).
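The refinement above is a one-line calculation; as a minimal numerical restatement (Python, for illustration only):

```python
# Refined normal ranges of GMTA from the geometry of Figure 2
amplitude_range = 2 * 0.3          # vertical separation of the envelope lines, deg C
linear_rise_30yr = 0.0059 * 30     # linear anomaly accumulated over one 30-year phase (Equation 1)

warming_phase = amplitude_range + linear_rise_30yr   # GMTA gained in a 30-year warming phase
cooling_phase = amplitude_range - linear_rise_30yr   # GMTA lost in a 30-year cooling phase
print(round(warming_phase, 2), round(cooling_phase, 2))   # 0.78 0.42
```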

Correlation of Model and Observed Global Mean Temperature Anomaly (GMTA)

In Table 2, data points for only five years were used to verify the validity of Equation 3 to model the observed data. However, it is important to verify how well the observed GMTA is modeled for any year.

Figure 4. Correlation between model and observed GMTA values. The model GMTA values are from Equation 3, and the observed GMTA values are from the Climatic Research Unit data shown in Figure 2.

How well the observed data are modeled can be established from a scatter plot of the observed and model GMTA values, as shown in Figure 4. For example, for the year 1998, the observed GMTA was 0.53 deg C and the model GMTA was 0.47 deg C, so the pair (0.47, 0.53) is plotted as a dot in Figure 4. In a similar manner, the paired model and observed GMTA values for all years from 1880 to 2009 are plotted in Figure 4.

Figure 4 shows a strong linear relationship (correlation coefficient r = 0.88) between the model and observed GMTA. With this high correlation coefficient, Figure 4 shows the important result that the observed GMTA can be modeled by the combination of a linear and a sinusoidal pattern given by Equation 3. The positive slope of the trend line indicates a positive relationship between model and observed GMTA: global cooling in the model corresponds to observed global cooling, and global warming in the model corresponds to observed global warming.
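The correlation is straightforward to reproduce. The sketch below assumes the observed annual HadCRUT3 anomalies have been saved to a local two-column file; the file name hadcrut_annual.txt and its year/anomaly layout are assumptions for illustration, not part of the original analysis:

```python
import math
import numpy as np

def gmta_model(year):
    """Equation 3: linear anomaly plus 60-year cosine oscillation, deg C."""
    return 0.0059 * (year - 1880) - 0.52 + 0.3 * math.cos((year - 1880) / 60.0 * 2 * math.pi)

# Hypothetical two-column text file (year, observed anomaly) exported from the
# woodfortrees.org HadCRUT3 series referenced in [2].
data = np.loadtxt("hadcrut_annual.txt")
years, observed = data[:, 0], data[:, 1]

mask = (years >= 1880) & (years <= 2009)
modelled = np.array([gmta_model(y) for y in years[mask]])

r = np.corrcoef(modelled, observed[mask])[0, 1]
print(f"correlation coefficient r = {r:.2f}")   # the article reports r = 0.88
```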

Global Mean Temperature Prediction Calculations

The following patterns may be inferred from the graph of the Global Mean Temperature Anomaly (GMTA) model shown in Figure 3 for the data from the Climatic Research Unit and the Hadley Centre [2]:

  1. Year 1880 was the start of a cooling phase and had a GMTA of –0.22 deg C.

  2. During the global cooling phase, the GMTA decreases by 0.42 deg C in 30 years.

  3. Global cooling and warming phases alternate with each other.

  4. During the global warming phase, the GMTA increases by 0.77 deg C in 30 years.

The patterns in the list above are sufficient to estimate the GMTA values at all of its turning points since 1880.

For example, as year 1880 with GMTA of –0.22 deg C was the start of a cooling phase of 0.42 deg C in 30 years, the next GMTA turning point was near 1880+30=1910 with GMTA of –0.22–0.42=-0.64 deg C. This GMTA value for 1910 is shown as (1910,-0.64) in Figure 3.

As year 1910 with GMTA of –0.64 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1910+30=1940 with GMTA of 0.77–0.64=0.13 deg C. This GMTA value for 1940 is shown as (1940,0.13) in Figure 3.

As year 1940 with GMTA of 0.13 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point was near 1940+30=1970 with GMTA of 0.13–0.42=-0.29 deg C. This GMTA value for 1970 is shown as (1970,-0.29) in Figure 3.

As year 1970 with GMTA of -0.29 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1970+30=2000 with GMTA of 0.77–0.29=0.48 deg C. This GMTA value for 2000 is shown as (2000,0.48) in Figure 3.

As the GMTA values calculated above using the global temperature patterns listed at the beginning of this section give good approximation of observed GMTA values at all GMTA turning points (1880, 1910, 1940, 1970 & 2000), it is reasonable to assume that the patterns may also be used for prediction.

Accordingly, as year 2000 with GMTA of 0.48 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point will be near 2000+30=2030 with GMTA of 0.48–0.42=0.06 deg C. This GMTA value for 2030 is shown as (2030,0.06) in Figure 3.

In a similar manner, the GMTA values for the remaining GMTA turning points for this century can be calculated, and the results are shown in Figure 3.
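The turning-point arithmetic above amounts to a simple recurrence: start at (1880, –0.22) and alternately subtract 0.42 deg C and add 0.77 deg C every 30 years. A minimal sketch:

```python
# Walk the GMTA turning points forward from 1880 using the four patterns listed above.
year, gmta = 1880, -0.22
phase = "cooling"                               # 1880 began a cooling phase
change = {"cooling": -0.42, "warming": +0.77}   # change in GMTA over each 30-year phase

turning_points = [(year, gmta)]
while year < 2090:
    year += 30
    gmta = round(gmta + change[phase], 2)
    turning_points.append((year, gmta))
    phase = "warming" if phase == "cooling" else "cooling"

print(turning_points)
# [(1880, -0.22), (1910, -0.64), (1940, 0.13), (1970, -0.29), (2000, 0.48),
#  (2030, 0.06), (2060, 0.83), (2090, 0.41)]
```

The 1910 through 2030 and the 2090 values match those quoted in the text; the 2060 value is simply what the same rules imply for the intervening turning point.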

Figure 3 shows a very interesting result: for the 20th century, the global warming from 1910 to 2000 was 0.48+0.64=1.12 deg C. In contrast, for the 21st century, the change in GMTA from 2000 to 2090 will be only 0.41–0.48=-0.07 deg C. This means that there will be little change in the GMTA over the 21st century! Why?

Why Does The Same Model Give A Global Warming Of About 1 deg C For The 20th Century But Nearly None For The 21st Century?

According to the data shown in Figure 3, it is true that the global warming of the 20th century was unprecedented, and hence that the corresponding sea level rise, melting of sea ice and climate change in general were unprecedented. However, this was because the century started when the oscillating anomaly was at its minimum near 1910, with GMTA of –0.64 deg C, and ended when it was at its maximum near 2000, with GMTA of 0.48 deg C, giving a large global warming of 0.48+0.64=1.12 deg C. This large warming was due to the rare occurrence of two global warming phases of 0.77 deg C each but only one cooling phase of 0.42 deg C in the 20th century, giving a global warming of 2*0.77-0.42=1.12 deg C.

In contrast to the 20th century, from Figure 3, there will be nearly no change in GMTA in the 21st century. This is because the century started when the oscillating anomaly was at its maximum near 2000 with GMTA of 0.48 deg C and will end when it is at its minimum near 2090 with GMTA of 0.41 deg C, giving a negligible change in GMTA of 0.41-0.48=-0.07 deg C. This negligible change in GMTA is due to the rare events of two global cooling phases of 0.42 deg C each but only one warming phase of 0.77 deg C occurring in the 21st century, giving the negligible change in GMTA of 0.77-2*0.42=-0.07 deg C. Note that this little change in GMTA for the 21st century is identical to that from 1880 to 1970, which makes the global warming from 1970 to 2000 by 0.77 deg C appear to be abnormally high.

If a century were 120 years long, we would not have this conundrum of nearly 1 deg C of warming in the 20th century but nearly none in the next!

Ocean Current Cycles

One of the most important variables affecting global mean surface temperature is ocean current cycles. The rising of cold water from the bottom of the sea to its surface results in a colder global mean surface temperature; weakening of this movement results in a warmer global mean surface temperature. Various ocean cycles have been identified. The most relevant to the global mean temperature turning points is the 20-to-30-year ocean cycle called the Pacific Decadal Oscillation (PDO) [6]:

Several independent studies find evidence for just two full PDO cycles in the past century: “cool” PDO regimes prevailed from 1890-1924 and again from 1947-1976, while “warm” PDO regimes dominated from 1925-1946 and from 1977 through (at least) the mid-1990’s (Mantua et al. 1997, Minobe 1997).

These cool and warm PDO regimes correlate well with the cooling and warming phases of GMTA shown in Figure 3.

The model in Figure 3 predicts global cooling until 2030. This result is also supported by the shift in the PDO that occurred at the end of the last century, which is expected to result in global cooling until about 2030 [7].

Effect Of CO2 Emission On Global Mean Temperature

Examination of Figure 3 shows that the Global Mean Temperature Anomaly (GMTA) for 1940 of 0.13 deg C is greater than that for 1880 of –0.22 deg C. Also, the GMTA for 2000 of 0.48 deg C is greater than that for 1940 of 0.13 deg C. This means that the GMTA value, when the oscillating anomaly is at its maximum, increases in every new cycle. Is this global warming caused by human emission of CO2?

The data required to establish the effect of CO2 emission on global mean temperature already exist. The global mean temperature data are available from the Climatic Research Unit and the Hadley Centre, as shown in Figure 3, and the CO2 emission data are available from the Carbon Dioxide Information Analysis Center [8]. For the period from 1880 to 1940, the average emission was about 0.8 Gt of carbon per year, and the increase in GMTA was 0.13+0.22=0.35 deg C. For the period from 1940 to 2000, the average emission was about 4 Gt of carbon per year, but the increase in GMTA was the same 0.48-0.13=0.35 deg C. This means that a 4/0.8 = 5-fold increase in CO2 emission had no effect on the increase in GMTA. This conclusively proves that the effect of 20th century human emission of CO2 on global mean temperature is nil.
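As a back-of-envelope restatement of this comparison (the emission averages are the approximate figures quoted above, read from the CDIAC data [8]; they are not recomputed here):

```python
# Compare the two 60-year periods: average CO2 emission vs. increase in GMTA.
periods = {
    "1880-1940": {"avg_emission_GtC_per_yr": 0.8, "gmta_start": -0.22, "gmta_end": 0.13},
    "1940-2000": {"avg_emission_GtC_per_yr": 4.0, "gmta_start": 0.13,  "gmta_end": 0.48},
}
for name, p in periods.items():
    rise = round(p["gmta_end"] - p["gmta_start"], 2)
    print(f"{name}: ~{p['avg_emission_GtC_per_yr']} Gt C/yr average emission, "
          f"GMTA increase {rise} deg C")
# Both periods show the same 0.35 deg C rise despite a roughly 5-fold increase in emissions.
```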

Note that the increase in GMTA of 0.35 deg C from 1880 to 1940 (or from 1940 to 2000), over a 60-year period, corresponds to a warming rate of 0.35/60=0.0058 deg C per year, which is essentially the slope of the linear anomaly given by Equation 1. As a result, the linear anomaly is not affected by CO2 emission. And since the oscillating anomaly is cyclic, it is not related to the 5-fold increase in human emission of CO2 either.

Figure 4, with its high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by the combination of a linear and a sinusoidal pattern given by Equation 3. This single GMTA pattern, valid from 1880 to 1940, remained valid from 1940 to 2000 after an approximately 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil.

Further evidence of the non-existent relationship between CO2 and GMTA is the IPCC's projection of global warming of 0.2 deg C per decade while the observed GMTA trend has been “quite stable since 2000” [5]. The evidence will be “unequivocal” if global cooling of about 0.42 deg C starts soon and continues until about 2030, as shown by the model in Figure 3. The IPCC projection for the GMTA in 2020 is 0.8 deg C, while the model prediction is 0.2 deg C, a large discrepancy of 0.6 deg C. If this global cooling is confirmed, it will then be time to bury the theory that CO2, a plant food, causes catastrophic global warming. Fortunately, we do not have to wait too long for the burial: less than ten years. It will be cheering news!

IPCC Projections

According to the IPCC [1], “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios.”

The IPCC explains this projection with the graph shown in Figure 5, where GMTA trend lines are drawn for four periods, all ending in 2005 and beginning in 1856, 1906, 1956 and 1981. These trend lines give an increasing warming rate: 0.045 deg C per decade for the RED line (1856–2005), 0.074 deg C per decade for the PURPLE line (1906–2005), 0.128 deg C per decade for the ORANGE line (1956–2005), and a maximum of 0.177 deg C per decade for the YELLOW line (1981–2005). The IPCC then concludes, “Note that for shorter recent periods, the slope is greater, indicating accelerated warming” [9].

If this IPCC interpretation is correct, catastrophic global warming is imminent, and it would be justified for the world to be gripped by fear of global warming. However, is the IPCC's “accelerated warming” conclusion shown in Figure 5 correct?

The GMTA pattern in Figure 3 shows alternating cooling and warming phases. As a result, in Figure 5, comparing the warming rate of a period that contains only a warming phase with that of a period that contains both warming and cooling phases will obviously show a higher warming rate for the first period. This is comparing apples to oranges, as the sketch below demonstrates.
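Fitting least-squares trends to the model of Equation 3 over windows ending in 2005, similar to those in Figure 5 (the longest window is started at 1880 here because the model is defined from 1880), gives steeper slopes for the shorter, more recent windows even though the model's underlying linear rate is a constant 0.059 deg C per decade; the apparent acceleration comes entirely from where each window sits on the 60-year cycle. A sketch:

```python
import numpy as np

def gmta_model(year):
    """Equation 3: linear anomaly plus 60-year cosine oscillation, deg C."""
    return 0.0059 * (year - 1880) - 0.52 + 0.3 * np.cos((year - 1880) / 60.0 * 2 * np.pi)

# Least-squares trend of the model over windows ending in 2005, as in Figure 5.
for start in (1880, 1906, 1956, 1981):
    years = np.arange(start, 2006)
    slope_per_decade = 10 * np.polyfit(years, gmta_model(years), 1)[0]
    print(f"{start}-2005: {slope_per_decade:+.3f} deg C per decade")
# The shorter, more recent windows show the steeper trends, with no acceleration
# in the model's underlying linear rate.
```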

Comparing apples to apples means comparing two periods that contain the same number of cooling and/or warming phases.

Figure 5. IPCC trend lines for the periods 1856–2005, 1906–2005, 1956–2005 and 1981–2005 [9].

One example of comparing apples to apples is to compare one period that has one warming phase with another that also has one warming phase. From Figure 3, two 30-year periods that have only one warming phase are the periods from 1910 to 1940 and from 1970 to 2000. For the period from 1910 to 1940, the increase in GMTA was 0.13+0.64=0.77 deg C, giving a warming rate of (0.77/30)*10=0.26 deg C per decade. Similarly, for the period from 1970 to 2000, the increase in GMTA was 0.48+0.29=0.77 deg C, giving an identical warming rate of 0.26 deg C per decade. Therefore, there is no “accelerated warming” in the period from 1970 to 2000 compared to the period from 1910 to 1940.

A second example of comparing apples to apples is to compare one period that has one cooling phase and one warming phase with another that also has one cooling phase and one warming phase. From Figure 3, two 60-year periods that each contain one cooling and one warming phase are 1880 to 1940 and 1940 to 2000. For the period from 1880 to 1940, the increase in GMTA was 0.13+0.22=0.35 deg C, giving a warming rate of (0.35/60)*10=0.06 deg C per decade. Similarly, for the period from 1940 to 2000, the increase in GMTA was 0.48-0.13=0.35 deg C, giving an identical warming rate of 0.06 deg C per decade. Therefore, there was no “accelerated warming” in the period from 1940 to 2000 compared to the period from 1880 to 1940.

From the above analysis, IPCC’s conclusion of “accelerated warming” is incorrect, and its graph shown in Figure 5 is an incorrect interpretation of the data.

Based on the observed GMTA pattern shown in Figure 3, a global warming phase lasts for 30 years and is followed by global cooling. As a result, the recent global warming phase that started in the 1970s ended in the 2000s, as shown by the current GMTA plateau, and global cooling should follow. Therefore, the IPCC's projection of 0.2 deg C of global warming per decade for the next two decades is incorrect. Also, the divergence between IPCC projections and observed GMTA values has been “discernible” since 2005, as shown in Figure 3.

According to the principle of Occam's Razor, given a choice between two explanations, one should choose the simpler one, the one that requires the fewest assumptions. Instead of applying this principle by assuming the causes of the GMTA turning points to be natural, the IPCC assumed them to be man-made [9]:

From about 1940 to 1970 the increasing industrialisation following World War II increased pollution in the Northern Hemisphere, contributing to cooling, and increases in carbon dioxide and other greenhouse gases dominate the observed warming after the mid-1970s.

As in the 1880s and 1910s, what if the causes of the GMTA turning points in the 1940s and 1970s were also natural?

As discussed above, Figure 4, with its high correlation coefficient of 0.88, shows that the observed GMTA can be modeled by the single linear-plus-sinusoidal pattern of Equation 3, and that this pattern, valid from 1880 to 1940, remained valid from 1940 to 2000 after an approximately 5-fold increase in human emission of CO2. The effect of human emission of CO2 on GMTA is therefore nil, and the IPCC's conclusion of “accelerated warming” shown in Figure 5 is incorrect.

What is the cause of the GMTA turning point from warming to plateau in the 2000s? Here is the suggestion by Mike MacCracken [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong.

According to the IPCC and the above suggestion, the 1940 GMTA turning point from global warming to cooling was caused by sulfates, the 1970 turning point from cooling to warming was caused by carbon dioxide, and the 2000 turning point from warming to plateau was caused by sulfates. It is interesting to note that sulfates and carbon dioxide would thus have given the globe alternating 30-year cooling and warming phases from 1940 to 2000. This is just absurd.

Instead of saying in private, “Be awkward if we went through a early 1940s type swing!”, while saying in public that global warming “is accelerating at a much faster pace”, please release the world from the fear of climate catastrophe from the use of fossil fuels, as this catastrophe is not supported by your own data. It is extremely callous not to do so.

Is the theory that “human emission of CO2 causes catastrophic global warming” one of the greatest blunders of “science”, or something worse? We will find the unambiguous answer within the next ten years. Let us hope they do not succeed in declaring a plant food a pollutant and taxing us before then.

==========================================

This document is also available as a PDF file, link below:

Predictions Of GMT

For any criticism, please leave a comment below, or contact me at orssengo@lycos.com

Girma J Orssengo

Bachelor of Technology in Mechanical Engineering, University of Calicut, Calicut, India

Master of Applied Science, University of British Columbia, Vancouver, Canada

Doctor of Philosophy, University of New South Wales, Sydney, Australia

===========================================

REFERENCES

[1] IPCC Fourth Assessment Report: Climate Change 2007

“a warming of about 0.2°C per decade is projected”

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/spmsspm-projections-of.html

[2] Observed Global Mean Surface Temperatures (HadCRUT3) from the Climatic Research Unit and the Hadley Centre.

http://www.woodfortrees.org/plot/hadcrut3vgl/compress:12/from:1880/plot/hadcrut3vgl/from:1880/trend/plot/hadcrut3vgl/from:1880/trend/offset:0.3/plot/hadcrut3vgl/from:1880/trend/offset:-0.3

[3] Climate Change Science Compendium 2009

“is accelerating at a much faster pace”

http://www.unep.org/pdf/ccScienceCompendium2009/cc_ScienceCompendium2009_full_en.pdf

[4] Climategate Email from Mike MacCracken to Phil Jones and Chris Folland

“that explanation is wearing thin”

http://www.eastangliaemails.com/emails.php?eid=947&filename=1231166089.txt

[5] Climategate Email from Mick Kelly to Phil Jones

“Be awkward if we went through a early 1940s type swing!”

http://www.eastangliaemails.com/emails.php?eid=927&filename=1225026120.txt

[6] The Pacific Decadal Oscillation (PDO)

http://jisao.washington.edu/pdo/

[7] Pacific Ocean Showing Signs of Major Shifts in the Climate

http://www.nytimes.com/library/national/science/012000sci-environ-climate.html

[8] Carbon Dioxide Information Analysis Center

Global CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring

http://cdiac.ornl.gov/ftp/ndp030/global.1751_2006.ems

[9] Climate Change 2007: Working Group I: The Physical Science Basis

How are Temperatures on Earth Changing?

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faq-3-1.html

347 Comments
April 25, 2010 4:40 pm

I have used this curve fitting technique on CO2, ice core, sea ice, and SST (http://www.kidswincom.net/climate.pdf), and your single cycle superimposed on a straight line is too simple to be used as a predictor. The data show multiple superimposed wavelength cycles, and a linear fit is more likely a segment of a long-term cycle. A cycle common among the various data sets has a wavelength of around 325 years. An 1100-year cycle could explain the MWP and LIA. These are natural cycles that CO2 follows rather than causes.

Editor
April 25, 2010 4:47 pm

Others have already commented along these lines, but WTH – if enough comments like these are added we could create a new ‘consensus’…..
This mathematical model just shows that there has been a 60-year cycle for the last 100+ years, that the 20th century warming has been overstated (and hence that AGW has been overstated, because AGW is based on the amount of warming in the 20thC), and that the IPCC climate models are unsafe (because they do not recognise and cannot explain the cycles).
But that’s about all.
Without an underlying mechanism, we have no way of knowing whether the 60-year cycle will continue in the same pattern. This new model therefore cannot be used for prediction.
Please note that the IPCC (Trenberth) has stated that its models cannot be used for prediction. The fact that the IPCC models have been ruthlessly milked for every possible prediction under the sun does not imply that this new model can be used for prediction.
What should happen – what should have started happening long ago – is that all model predictions should be scrapped, all political activity based on such predictions should cease, and scientists should get on with real research into the mechanisms of climate until the point is reached that we really do have understanding of how climate works.

April 25, 2010 4:49 pm

John Cooke (13:13:35) :
As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
I agree, this is just curve fitting a posteriori. But so is so much that goes for ‘science’ these days.

April 25, 2010 4:53 pm

Richard M (15:58:00) :
I tried out a stock fitting program which very accurately developed a fifth order equation for modeling past behaviour, but had no skill at predicting future behaviour.
“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”
Johnny Von Neumann

thelastpost
April 25, 2010 4:58 pm

The fitting to recent climate fluctuation presented by Girma Orssengo is in a way analogous to parts of Nicola Scafetta’s recent posting where he fitted the recent solar cycle (sunspot) wavetrain of 4 cycles to a historic wavetrain (at one of the previous solar minima) showing close similarity. OK no physical model, but if you can demonstrate enough oscillatory cycles, then characterising them and projecting forward is of some interest.

Bill Illis
April 25, 2010 4:59 pm

If one is looking for the driver of this 60 year cycle, it is most likely the Atlantic Multidecadal Oscillation (AMO).
Here is the AMO and ln (CO2) [for the upward trend] against Hadcrut3 back to 1854 on a monthly basis.
It is hard to miss the correlation (throw in some ENSO variability as well and one gets pretty close).
http://img121.imageshack.us/img121/9162/hadcrut3amoco2t.png

April 25, 2010 5:04 pm

Wow. I need to read this over a couple of times to make intelligent comments. However, we must, I think, be just a little careful with our language. This quote is an example: “This conclusively proves that the effect of 20th century human emission of CO2 on global mean temperature is nil.” No, it proves nothing; it does falsify the hypothesis that the global temperature increase during these periods was caused by increases in CO2. Remember, please, we do not prove things, we falsify them. Only the Sophists and Propagandists claim to do what logically they cannot: prove something in science.
My wife is calling me to domestic duty….more on science later.
My wife is calling me to domestic duty….more on science later.

Bruce of Newcastle
April 25, 2010 5:05 pm

Similar study with a 65.7 year peak to peak sinusoidal curve fit to HadCRUT since 1850 here:
http://digitaldiatribes.wordpress.com/2009/02/10/deconstructing-the-hadcrut-data/
However the solar cycle 24 story may see the anomaly maundering off in a Dalton direction for another 1 C or more drop by 2030:
http://www.warwickhughes.com/agri/Solar_Arch_NY_Mar2_08.pdf
Plenty of grist in this paper to add a solar cycle periodicity term to the sinusoidal model.

MartinGAtkins
April 25, 2010 5:06 pm

Richard Telford (13:23:18) :
Your model is very useful. Extrapolating back in time to the early part of the last ice age, your model predicts temperatures a below absolute zero. I knew the ice ages were cold, but that is perhaps a tad excessive. Predicting forwards, we can determine how long it will be before the oceans boil.
It’s true that if we know the physical system and its drivers, we can model and perhaps predict outcomes. However, when the system is chaotic and no hypothesis can explain an observed phenomenon, it’s reasonable to assume the phenomenon will remain intact.
There is no convincing hypothesis that explains the warming since the Little Ice Age. They all have flaws or lack observable confirmation. Faced with no identifiable cause of the effect, the trend remains a continuum.
The study takes the past and projects its apparent pattern into the future, and is valid while no conditions change. Of course conditions can and do change.
The model then is used to identify and quantify any event that may cause a deviation from its projected path.
Of course the phenomenon may be just a product of our own desire to find meaningful patterns in random events.

Buddenbrook
April 25, 2010 5:22 pm

This article seems to be ideologically motivated nonsense. It is based on no physics. To expect such short cycles to give us predictive value is a weak argument. Look at the years 800-2000: do the cycles apply? No. And on top of that, the mathematical model (or whatever it should be called) is based on the very bogus “consensus” temperature data that has come under criticism in the skeptic blogosphere. People who have praised the article seem to do so only because it gives them the right, i.e. desired, preconceived answers. This is called confirmation bias and has got nothing to do with valid science. In fact this article is an example of the type of corrupted post-modern science, à la Mike Hulme, in which preconceived answers are written into the process and methods, which then in effect merely posture as science.
Posturing here in vain, I must add, as most of the skeptics here seem to be critical of this article. And this is positive.
The OP shows his motives and colours with comments like these:
“Fossil fuels allowed man to live his life as a proud human, but the IPCC asserts its use causes catastrophic global warming.”
(What has that got to do with the model???) We should be especially proud of the two (I and II) 20th century ‘fossil fuelled’ global spectacles.
“the extremely irrational concept that the gas CO2, a plant food, i.e. foundation of life, is a pollutant because it causes catastrophic global warming.”
There are many things that are foundations of good things but that in excess would be very harmful. So the concept isn’t irrational; in fact it is a plausible hypothesis. To claim it irrational is irrational. It’s just that the catastrophic scenarios built on that hypothesis do not seem to be strongly supported by the DATA, nor by any considerable body of transparent, high-quality research. But the hypothesis in itself is plausible.
And you can’t deduce the weakness of that hypothesis from any “general” principles like “CO2 plays a part in many good things and therefore cannot do anything harmful to life on earth”. That is so Aristotelian. Get on with the times; the Newtonian revolution didn’t happen yesterday.
If Michael Mann or Gavin Schmidt had written an article like this, but arrived at different results predicting catastrophic warming, ‘a McIntyre’ would tear it into a thousand shreds.
And if they added in between the lines “mother earth” type poetry to kind of support their arguments, which is the equivalent of “CO2, the foundation of life”, you would see no end to the laughter.
Skeptics should not fall victim to the confirmation bias that is plaguing the ‘warmist research’ and jump in to support anything as long as it gives the desired answers and then praise it as good science. That would be embarrassing and counterproductive in the end.

J.Hansford
April 25, 2010 5:43 pm

stevengoddard (12:37:38) :
Both HadCrut and GISS show a good correlation between CO2 and temperature.
———————————————————
That’s because HadCrut and GISS are designed to have a correlation….. It’s Claytons science. The science you do when you aren’t doing science;-)

April 25, 2010 5:48 pm

Mike McMillan (14:55:02) :

Sorry, but we English get a little fed up about the corruption of the language.

Yeah, I used to too. Then I discovered that a great deal of the US spelling, and even pronunciation, is merely ‘old’ English. What seems to me to have happened is that the UK has moved on, with a strong European influence, and the US has remained steadfast. They still use ‘our’ imperial measurements, too.
So the US have not corrupted ‘our’ language, they have just kept and refined ‘their own’ language. ‘Our’ language has changed in different ways.
Vive la difference!

April 25, 2010 5:50 pm

Sorry, last post should have been addressed to The Ghost Of Big Jim Cooley (13:20:37) :

MartinGAtkins
April 25, 2010 5:52 pm

stevengoddard (14:59:59) :
dT/dt / dCO2/dt = dT/dCO2
dt has no value. Data bunching is indicated.

BarryW
April 25, 2010 6:02 pm

While the equation does not explain why it is happening, it does point to a probable “cyclic” phenomenon that is not addressed by the GCMs. A number of people (myself included, when I played with the temp data and R) have seen this in the data. The linear rates that are espoused by the CAGW proponents are most likely too high. This is obvious from the 21st century divergence from their projections. If the cyclic nature of the temperatures has any validity, we should see a lowering of temperatures on the order of 0.3 or 0.4 deg over the next 20 years before they start rising again.

AC Adelaide
April 25, 2010 6:11 pm

Nice comment RACookPE1978. Clearly this model shows the longer term trend is not linked to CO2 and presents a more realistic two-part breakup of the trend. Great start. Now it is time, as you suggest, to expand the time horizon and see how that trend works backwards, and secondly to work out the “why” of what causes these two underlying trends. OK, PDO is a start. The mathematical model gives us an insight as to where we might be going even if we don’t know the “why”. The theory of “what” is the first important step in any theory of “why”. The problem with the AGWers was that they started with a theory of “why” (rising CO2) and allowed that to drive their mathematical modelling, which then forced them into data tampering to make their model work. They are not going to abandon their “why” until they are made to focus on the “what”. This simple model puts it front and centre for them to explain in the context of rising CO2.

Richard M
April 25, 2010 6:23 pm

stevengoddard (16:53:01) :
I tried out a stock fitting program which very accurately developed a fifth order equation for modeling past behaviour, but had no skill at predicting future behaviour.
Oh, it probably did. It’s just that the time scale was so small that it was not useful. A week of stock market trading is probably equivalent to 100 years of climate (or more).

April 25, 2010 6:32 pm

John Cooke (13:13:35) : – Exactly correct.
Dr. Orssengo,
First, I commend you for this undertaking, and to Anthony for posting this, and offer some comments based on my experience in similar modeling efforts, though in a different field.
I have approximately 25 years experience with both correlation models, and first principles models of complex heat-and-material-balanced, plus kinetic equations, for petroleum refining and petrochemical processes.
First principles models begin with known physics, and when performed properly, can be used to extrapolate into the future. These must include all variables, and correct equations for their behavior, which the GCMs clearly do not. Hence the GCMs are wrong at this time, and will remain wrong until they comply with the fundamentals.
Correlation models (as this one) take existing data, fit a curve to it, and can have pretty good fit. But these are often quite useless as predictors of future behavior, exactly as John Cooke stated at 13:13:35.
Model builders in refineries and petrochemical plants have known these basic truths for decades. They were forced to use correlation models for many years, until the first principles models finally became available and robust.
There are several unknowns that could influence the future GMTA, in particular more or fewer volcanoes, or volcanoes with greater or lesser amounts of particulates and/or sulfates, greater or lesser cloud coverage (no matter the origin, cosmic rays or otherwise), to name only two.
Also, if I may, Dr. Orssengo, there appears to be a better correlation model for the GMTA historical data. As shown in your Figure 4, the linear fit line does not pass through data points (0,0), (0.2,0.2), (0.4,0.4), etc. This indicates that the correlation model will fail, with greater and greater error as one predicts farther into the future.
A brief inspection of Figure 4 shows that there appears to be some error in the data points at the lower left, for all data points with values less than -0.4 on the X-axis. These data points, if removed from the graph, would appear to allow the trend line to pass very close to or through the points mentioned just above, (0,0), etc. It appears that the problem exists with the twin peaks around 1895 – 1900 in your Figure 3.

Charles Higley
April 25, 2010 6:37 pm

There was a comment somewhere near the bottom here which mentioned that the longer term temperature changes should also be included. One of them mentions that the sinusoidal pattern may trend downward, a la the Little Ice Age.
Good point. Does it not bother anybody that the Holocene Optimum, Minoan, Roman, Medieval, and Modern Warm Periods are showing a clear downward trend?
Our next high will most likely be cooler than our Modern high.

Dave McK
April 25, 2010 6:49 pm

I believe the author is doing what a person interested in extracting meaning from time series data would have done first thing.
Fourier transform.
Do NOT average.
Of course CO2 correlates, Telford. Effects always follow their causes – they never precede them.

April 25, 2010 7:02 pm

I think this analysis is useful in the near term to rule out any role of CO2 in climate ‘forcing’.
However, I have to agree with others who have posted stating that this model should not be extrapolated forward or backward in time beyond the next one or two cycles, and should not be relied on historically beyond one or two cycles from the start of this analysis in 1880.
I think this model is analogous to a 2D formula correctly describing a brief projection of a 3D trend. Because this model makes no attempt to define the rising GMTA, but only describes its attributes as projected into the short term, there is no long term applicability here.
Still, the mathematical analysis is commendable, and the definition of 20th century temperatures can be derived from the formulas given.
As others have stated, though, this can’t be the true picture: extrapolated backwards, absolute zero is reached not too long ago, and extrapolated forwards, everything would be a plasma not too far from now.

WestHoustonGeo
April 25, 2010 7:10 pm

Quoting:
“Sorry, but we English get a little fed up about the corruption of the language.”
Commenting:
We know. That’s why we do it. 😉

Dave N
April 25, 2010 7:11 pm

Those blasted observations are continuing to screw up the models! Someone do something about them already.. before it’s too late!!!

April 25, 2010 7:15 pm

MartinGAtkins (17:52:39) :
LOL

Steven mosher
April 25, 2010 7:23 pm

Roger Sowell (18:32:26) :
“First principles models begin with known physics, and when performed properly, can be used to extrapolate into the future. These must include all variables, and correct equations for their behavior, which the GCMs clearly do not. Hence the GCMs are wrong at this time, and will remain wrong until they comply with the fundamentals.”
Physics models do not have to include all variables. They have to include the relevant variables for the purpose at hand. For example, I may have a very high fidelity physics simulation (model) of an aircraft in flight, or a bomb as it drops. In both cases, I may opt to use a model with less fidelity, with variables missing, provided my solution is within the tolerances that my task requires. For example, if I want to calculate the range of the aircraft, I can probably do without a wide variety of system parameters. I can even “estimate” or “project” what the atmosphere will be like (a standard day, for example) without actually modelling the atmosphere. I can “forecast” the winds I expect to encounter (a head wind of 20 knots) without modelling the exact spatio-temporal aspects of it. For other tasks, this might not be good enough; for example, I may need a gust model. The same goes for the physical process of dropping bombs and predicting where they will land. Close enough is sometimes good enough for the task at hand. Like many generalizations about models, the idea that all variables must be present is not factually correct. The same goes for the equations used to define the physics. Again, it depends on the task. Finally, a GCM, just like any law of nature, will always be “wrong” in the trivial sense that it doesn’t predict EXACTLY. The question is, is the model skillful for its purpose? Does it ‘work’?
Does it work better than a naive model?
In 1850 the global temperature was X. A naive forecast (Willis’ null hypothesis) would predict that the temp would be X today.
CO2 content is not information in his model. In fact there is no information in the naive model. In a naive model (there is just natural variation), if it’s X in 1850, then your prediction for 2010 will just be X +- some large number. Hard to go wrong with that Null. But even the simplest physical model that attributed some warming to CO2 would beat that model for skill. Skill is about power in a test, but I’m off topic now.
