Predictions Of Global Mean Temperatures & IPCC Projections

Guest post by Girma Orssengo, B. Tech, MASc, PhD

The Intergovernmental Panel on Climate Change (IPCC) claims that human emission of CO2 causes catastrophic global warming. When such an extraordinary claim is made, everyone with a background in science has to look at the data and verify whether the claim is justified. In this article, a mathematical model is developed that agrees with the observed Global Mean Temperature Anomaly (GMTA), and its prediction shows global cooling of about 0.42 deg C until 2030. Also, comparison of the observed increase in human emission of CO2 with the increase in GMTA during the 20th century shows no relationship between the two. As a result, the IPCC's claim of climate catastrophe is not supported by the data.

Fossil fuels allowed man to live his life as a proud human, but the IPCC asserts their use causes catastrophic global warming. Fortunately, the IPCC's global warming claim that "For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios" [1] is not supported by observations, as shown in Figure 1, which shows a plateau in the global mean temperature trend for the last decade.

Figure 1. Observed global mean temperatures compared with IPCC projections.

Figure 1 also shows that the observed temperatures are even less than the IPCC projections for emission held constant at the 2000 level.

As a result, the statement we often hear from authorities such as UN Secretary-General Ban Ki-moon that climate change "is accelerating at a much faster pace than was previously thought by scientists" [3] is incorrect.

Thanks to the release of the private emails of climate scientists, we can now learn from their own words whether global warming "is accelerating at a much faster pace" or not. In an email dated 3-Jan-2009, Mike MacCracken wrote to Phil Jones, Folland and Chris [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong. Otherwise, the Skeptics will be all over us–the world is really cooling, the models are no good, etc. And all this just as the US is about ready to get serious on the issue.

We all, and you all in particular, need to be prepared.

Similarly, in an email dated 24-Oct-2008, Mick Kelly wrote to Phil Jones [5]:

Just updated my global temperature trend graphic for a public talk and noted that the level has really been quite stable since 2000 or so and 2008 doesn’t look too hot.

Be awkward if we went through a early 1940s type swing!

The above statements from the Climategate emails conclusively prove that the phrase widely used in public by authorities, that global warming "is accelerating at a much faster pace", is supported neither by climate scientists in private nor by the observed data.

Thanks also go to the Climatic Research Unit (CRU) for daring to publish global mean temperature data that are "quite stable since 2000", contrary to IPCC projections of 0.2 deg C warming per decade. Had the CRU not done this, we would have been forced to swallow the extremely irrational concept that the gas CO2, a plant food, i.e. a foundation of life, is a pollutant because it causes catastrophic global warming.

As the IPCC's "models are no good", the objective of this article is to develop a valid mathematical global mean temperature model based on observed temperature patterns.

Mathematical Model For The Global Mean Temperature Anomaly (GMTA) Based On Observed Temperature Patterns

The Global Mean Temperature Anomaly (GMTA) data from the Climatic Research Unit (CRU) shown in Figure 2 will be used to develop the mathematical model. In this article, the observed GMTA data from the CRU are assumed to be valid.

Examination of Figure 2 shows that the globe is warming at a linear rate, as shown by the central least-squares trend line given by the equation

Linear anomaly in deg C = 0.0059*(Year-1880) – 0.52 Equation 1

Figure 2 also shows that superimposed on this linear anomaly line there is an oscillating anomaly that gives the Global Mean Temperature Anomaly (GMTA) the characteristics summarized in Table 1.

Table 1. Characteristics of the observed Global Mean Temperature Anomaly (GMTA) shown in Figure 2.

From 1880s to 1910s: end of warming, plateau at -0.2 deg C, then cooling trend

From 1910s to 1940s: end of cooling, plateau at -0.6 deg C, then warming trend

From 1940s to 1970s: end of warming, plateau at 0.1 deg C, then cooling trend

From 1970s to 2000s: end of cooling, plateau at -0.3 deg C, then warming trend

From 2000s to 2030s: end of warming, plateau at 0.5 deg C, then ? trend

A mathematical model can be developed that satisfies the requirements listed in Table 1. If the model gives good approximations for the GMTA values at its turning points (plateaus) and for the GMTA trends between successive turning points, as summarized in Table 1, it may be used for prediction.

Figure 2. Observed Global Mean Temperature Anomaly (GMTA) from the Climatic Research Unit [2], with the central least-squares trend line and parallel envelope lines.

For the oscillating anomaly, the sinusoidal cosine function meets the requirements listed in Table 1. From Figure 2, the amplitude of the oscillating anomaly is given by the vertical distance, in deg C, from the central linear anomaly line to either the top or bottom parallel line, and it is about 0.3 deg C. From Figure 2, the oscillating anomaly was at its maximum in the 1880s, 1940s & 2000s, and at its minimum in the 1910s and 1970s. The number of years between successive maxima (or minima) of the oscillating anomaly is the period of the cosine function, about 1940-1880 = 1970-1910 = 60 years. With its amplitude of 0.3 deg C and its period of 60 years determined, the mathematical equation for the oscillating anomaly, for the years starting from 1880, can be written as

Oscillating anomaly in deg C = 0.3*Cos(((Year-1880)/60)*2*3.1416) Equation 2

In the above equation, the factor 2*3.1416 is used to convert the argument of the cosine function to radians, which is required for computation in Microsoft Excel. If the angle required is in degrees, replace 2*3.1416 with 360.

Combining the linear anomaly given by Equation 1 and the oscillating anomaly given by Equation 2 gives the equation for the Global Mean Temperature Anomaly (GMTA) in deg C for the years since 1880 as

GMTA = 0.0059*(Year-1880) – 0.52 + 0.3*Cos(((Year-1880)/60)*2*3.1416) Equation 3
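Equation 3 is simple enough to evaluate anywhere; the sketch below is a minimal Python version of the Excel formula described above (the function name `gmta` is my own label, not from the article):

```python
import math

def gmta(year):
    """Equation 3: Global Mean Temperature Anomaly in deg C.

    Linear anomaly (Equation 1) plus a 60-year, 0.3 deg C amplitude
    cosine oscillation (Equation 2)."""
    linear = 0.0059 * (year - 1880) - 0.52
    oscillating = 0.3 * math.cos((year - 1880) / 60 * 2 * math.pi)
    return linear + oscillating

# The 2030 turning point predicted in the article, about 0.06 deg C
print(round(gmta(2030), 2))
```

Note that `2 * math.pi` plays the role of the 2*3.1416 radian-conversion factor in the Excel formula.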

The validity of this model may be verified by comparing its estimate with observed values at the GMTA turning points as summarized in Table 2.

Table 2. Comparison of the model with observations for GMTA in deg C at its turning points.

Warming plateau for the 1880s: observed -0.2, model -0.22

Cooling plateau for the 1910s: observed -0.6, model -0.64

Warming plateau for the 1940s: observed +0.1, model +0.13

Cooling plateau for the 1970s: observed -0.3, model -0.29

Warming plateau for the 2000s: observed +0.5, model +0.48

(Observed values in deg C from Table 1; model values in deg C from Equation 3.)

Table 2 shows excellent agreement for the GMTA values between observation and mathematical model for all observed GMTA turning points.
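The agreement in Table 2 can be reproduced by evaluating Equation 3 directly at the five turning-point years (a minimal sketch; the observed plateau values are those from Table 1):

```python
import math

def gmta(year):
    # Equation 3: linear anomaly plus 60-year cosine oscillation
    return 0.0059 * (year - 1880) - 0.52 + 0.3 * math.cos((year - 1880) / 60 * 2 * math.pi)

# Observed plateau values (deg C) from Table 1 at the turning-point years
observed = {1880: -0.2, 1910: -0.6, 1940: 0.1, 1970: -0.3, 2000: 0.5}
for year, obs in observed.items():
    print(f"{year}: observed {obs:+.1f}, model {round(gmta(year), 2):+.2f}")
```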

A graph of the GMTA model given by Equation 3 is shown in Figure 3, which includes the observed GMTA and short-term IPCC projections for GMTA from 2000 to 2025. In addition to the verification shown in Table 2, Figure 3 shows good agreement for the GMTA trends throughout observed temperature records, so the model may be used for prediction. As a result, Figure 3 includes GMTA predictions until 2100, where the year and the corresponding GMTA values are given in parentheses for all the GMTA turning points.

As shown in Figure 3, discrepancies exist between observed and model GMTA values at the end of the 1890s, when the observed values were significantly warmer than the model pattern, and in the 1950s, when the observed values were significantly colder than the model pattern.

Figure 3. Comparison of observed Global Yearly Mean Temperature Anomaly (GMTA) with models.

From the model in Figure 3, during the observed temperature record, there were two global warming phases. The first was from 1910 to 1940 with a warming of 0.13+0.64=0.77 deg C in 30 years. The second was from 1970 to 2000 with a warming of 0.48+0.29=0.77 deg C in 30 years. Note that both warming phases have an identical increase in GMTA of 0.77 deg C in 30 years, which gives an average warming rate of (0.77/30)*10=0.26 deg C per decade.

From the model in Figure 3, during the observed temperature record, there were two global cooling phases. The first was from 1880 to 1910 with a cooling of 0.64-0.22=0.42 deg C in 30 years. The second was from 1940 to 1970 with a cooling of 0.13+0.29=0.42 deg C in 30 years. Note that both cooling phases have an identical decrease in GMTA of 0.42 deg C in 30 years, which gives an average cooling rate of (0.42/30)*10=0.14 deg C per decade.

The above results for the normal ranges of GMTA determined from the model can also be calculated using simple geometry in Figure 2. In this figure, almost all observed GMTA values are enveloped by the two parallel lines that are 0.6 deg C apart. Therefore, as a first approximation, the normal range of GMTA is 0.6 deg C. From Figure 2, the period for a global warming or cooling phase is about 30 years. Therefore, as a first approximation, the normal rate of global warming or cooling is (0.6/30)*10=0.2 deg C per decade.

The above approximation of 0.6 deg C for the normal range of GMTA should be refined by including the effect of the linear warming anomaly given by Equation 1, about 0.006 deg C per year, which is the slope of the two parallel envelope lines in Figure 2. As the oscillating anomaly changes by 0.6 deg C in the 30 years between its turning points, the linear anomaly increases by 0.006*30=0.18 deg C. Due to this persistent warming, instead of the GMTA increasing or decreasing by the same 0.6 deg C, it increases by 0.6+0.18=0.78 deg C during its warming phase and decreases by 0.6-0.18=0.42 deg C during its cooling phase. As a result, the refined normal ranges of GMTA are about 0.77 deg C in 30 years during its warming phase and 0.42 deg C in 30 years during its cooling phase (the 0.01 deg C difference comes from rounding the slope of 0.0059 up to 0.006). These results, obtained using simple geometry in Figure 2, agree with those obtained from the model in Figure 3.

Correlation of Model and Observed Global Mean Temperature Anomaly (GMTA)

In Table 2, data points for only five years were used to verify the validity of Equation 3 to model the observed data. However, it is important to verify how well the observed GMTA is modeled for any year.

Figure 4. Correlation between model and observed GMTA values. The model GMTA values are from Equation 3, and the observed GMTA values are from the Climatic Research Unit, shown in Figure 2.

How well the observed data are modeled can be established from a scatter plot of the observed and model GMTA values, as shown in Figure 4. For example, for year 1998, the observed GMTA was 0.53 deg C and the model GMTA is 0.47 deg C, so the pair (0.47, 0.53) is plotted as a dot in Figure 4. In a similar manner, all the paired model and observed GMTA values for the years from 1880 to 2009 are plotted as shown in Figure 4.

Figure 4 shows a strong linear relationship (correlation coefficient r=0.88) between the model and observed GMTA. With this high correlation coefficient, Figure 4 shows the important result that the observed GMTA can be modeled by the combination of a linear and a sinusoidal pattern given by Equation 3. The positive slope of the trend line indicates a positive relationship between model and observed GMTA: global cooling in the model indicates observed global cooling, and global warming in the model indicates observed global warming.

Global Mean Temperature Prediction Calculations

The following patterns may be inferred from the graph of the Global Mean Temperature Anomaly (GMTA) model shown in Figure 3 for the data from the Climatic Research Unit [2]:

  1. Year 1880 was the start of a cooling phase and had a GMTA of –0.22 deg C.

  2. During the global cooling phase, the GMTA decreases by 0.42 deg C in 30 years.

  3. Global cooling and warming phases alternate with each other.

  4. During the global warming phase, the GMTA increases by 0.77 deg C in 30 years.

The patterns in the list above are sufficient to estimate the GMTA values at all of its turning points since 1880.

For example, as year 1880 with GMTA of –0.22 deg C was the start of a cooling phase of 0.42 deg C in 30 years, the next GMTA turning point was near 1880+30=1910 with GMTA of –0.22–0.42=-0.64 deg C. This GMTA value for 1910 is shown as (1910,-0.64) in Figure 3.

As year 1910 with GMTA of –0.64 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1910+30=1940 with GMTA of 0.77–0.64=0.13 deg C. This GMTA value for 1940 is shown as (1940,0.13) in Figure 3.

As year 1940 with GMTA of 0.13 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point was near 1940+30=1970 with GMTA of 0.13–0.42=-0.29 deg C. This GMTA value for 1970 is shown as (1970,-0.29) in Figure 3.

As year 1970 with GMTA of -0.29 deg C was the end of a global cooling phase, it is also the start of a global warming phase of 0.77 deg C in 30 years. As a result, the next GMTA turning point was near 1970+30=2000 with GMTA of 0.77–0.29=0.48 deg C. This GMTA value for 2000 is shown as (2000,0.48) in Figure 3.

As the GMTA values calculated above using the global temperature patterns listed at the beginning of this section give good approximation of observed GMTA values at all GMTA turning points (1880, 1910, 1940, 1970 & 2000), it is reasonable to assume that the patterns may also be used for prediction.

As a result, as year 2000 with GMTA of 0.48 deg C was the end of a global warming phase, it is also the start of a global cooling phase of 0.42 deg C in 30 years. As a result, the next GMTA turning point will be near 2000+30=2030 with GMTA of 0.48–0.42=0.06 deg C. This GMTA value for 2030 is shown as (2030,0.06) in Figure 3.

In a similar manner, the GMTA values for the remaining GMTA turning points for this century can be calculated, and the results are shown in Figure 3.
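The bookkeeping above amounts to a short loop: start at (1880, -0.22) and alternately subtract 0.42 deg C (cooling phase) and add 0.77 deg C (warming phase) every 30 years. A sketch, using only the four patterns listed at the start of this section:

```python
# Patterns 1-4: alternating 30-year phases, starting with cooling in 1880
year, anomaly = 1880, -0.22
cooling = True
turning_points = [(year, anomaly)]
while year < 2090:
    anomaly += -0.42 if cooling else 0.77
    year += 30
    cooling = not cooling
    turning_points.append((year, round(anomaly, 2)))

# Reproduces the turning points labelled in Figure 3, through the
# predicted (2030, 0.06) and on to (2090, 0.41)
print(turning_points)
```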

Figure 3 shows a very interesting result that for the 20th century, the global warming from 1910 to 2000 was 0.48+0.64=1.12 deg C. In contrast, for the 21st century, the change in GMTA from 2000 to 2090 will be only 0.41–0.48=-0.07 deg C. This means that there will be little change in the GMTA for the 21st century! Why?

Why Does The Same Model Give A Global Warming Of About 1 deg C For The 20th Century But Nearly None For The 21st Century?

According to the data shown in Figure 3, it is true that the global warming of the 20th century was unprecedented, and so were the corresponding sea level rise, melting of sea ice and climate change in general. However, this was because the century started when the oscillating anomaly was at its minimum near 1910, with a GMTA of -0.64 deg C, and ended when it was at its maximum near 2000, with a GMTA of 0.48 deg C, giving a large global warming of 0.48+0.64=1.12 deg C. This large warming was due to the rare event of two global warming phases of 0.77 deg C each but only one cooling phase of 0.42 deg C occurring in the 20th century, giving a global warming of 2*0.77-0.42=1.12 deg C.

In contrast to the 20th century, from Figure 3, there will be nearly no change in GMTA in the 21st century. This is because the century started when the oscillating anomaly was at its maximum near 2000 with GMTA of 0.48 deg C and will end when it is at its minimum near 2090 with GMTA of 0.41 deg C, giving a negligible change in GMTA of 0.41-0.48=-0.07 deg C. This negligible change in GMTA is due to the rare events of two global cooling phases of 0.42 deg C each but only one warming phase of 0.77 deg C occurring in the 21st century, giving the negligible change in GMTA of 0.77-2*0.42=-0.07 deg C. Note that this little change in GMTA for the 21st century is identical to that from 1880 to 1970, which makes the global warming from 1970 to 2000 by 0.77 deg C appear to be abnormally high.

Had a century lasted 120 years, we wouldn't have this conundrum of nearly 1 deg C of warming in the 20th century but nearly none in the next!

Ocean Current Cycles

One of the most important variables affecting global mean surface temperature is ocean current cycles. The rising of cold water from the bottom of the sea to its surface results in a colder global mean surface temperature; weakening of this movement results in a warmer one. Various ocean cycles have been identified. The most relevant to the global mean temperature turning points is the 20- to 30-year-long ocean cycle called the Pacific Decadal Oscillation (PDO) [6]:

Several independent studies find evidence for just two full PDO cycles in the past century: “cool” PDO regimes prevailed from 1890-1924 and again from 1947-1976, while “warm” PDO regimes dominated from 1925-1946 and from 1977 through (at least) the mid-1990’s (Mantua et al. 1997, Minobe 1997).

These cool and warm PDO regimes correlate well with the cooling and warming phases of GMTA shown in Figure 3.

The model in Figure 3 predicts global cooling until 2030. This result is also supported by the shift in the PDO that occurred at the end of the last century, which is expected to result in global cooling until about 2030 [7].

Effect Of CO2 Emission On Global Mean Temperature

Examination of Figure 3 shows that the Global Mean Temperature Anomaly (GMTA) for 1940 of 0.13 deg C is greater than that for 1880 of –0.22 deg C. Also, the GMTA for 2000 of 0.48 deg C is greater than that for 1940 of 0.13 deg C. This means that the GMTA value, when the oscillating anomaly is at its maximum, increases in every new cycle. Is this global warming caused by human emission of CO2?

The data required to establish the effect of CO2 emission on global mean temperature already exist. The global mean temperature data are available from the Climatic Research Unit, shown in Figure 3, and the CO2 emission data are available from the Carbon Dioxide Information Analysis Center [8]. For the period from 1880 to 1940, the average emission of CO2 was about 0.8 G-ton, and the increase in the GMTA was 0.13+0.22=0.35 deg C. For the period from 1940 to 2000, the average emission of CO2 was about 4 G-ton, but the increase in GMTA was the same 0.48-0.13=0.35 deg C. This means that a 4/0.8=5-fold increase in CO2 emission had no effect on the increase in the GMTA. This conclusively proves that the effect of 20th century human emission of CO2 on global mean temperature is nil.

Note that the increase in GMTA of 0.35 deg C from 1880 to 1940 (or from 1940 to 2000) in a 60 year period has a warming rate of 0.35/60=0.0058 deg per year, which is the slope of the linear anomaly given by Equation 1. As a result, the linear anomaly is not affected by CO2 emission. Obviously, as the oscillating anomaly is cyclic, it is not related to the 5-fold increase in human emission of CO2.
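The comparison above reduces to four numbers from the text; a trivial arithmetic check (the emission averages of about 0.8 and 4 G-ton are the figures quoted above from [8]):

```python
# GMTA rise in each 60-year period (turning-point values from Figure 3)
rise_1880_1940 = 0.13 - (-0.22)     # deg C
rise_1940_2000 = 0.48 - 0.13        # deg C
emission_ratio = 4.0 / 0.8          # ~5-fold increase in average CO2 emission
rate_per_year = rise_1880_1940 / 60 # cf. the Equation 1 slope of 0.0059
print(round(rise_1880_1940, 2), round(rise_1940_2000, 2),
      emission_ratio, round(rate_per_year, 4))
```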

Figure 4, with high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by a combination of a linear and sinusoidal pattern given by Equation 3. This single GMTA pattern that was valid in the period from 1880 to 1940 was also valid in the period from 1940 to 2000 after about 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil.

Further evidence for the non-existent relationship between CO2 and GMTA is the IPCC's projection of global warming of 0.2 deg C per decade while the observed GMTA trend has been "quite stable since 2000" [5]. The evidence will be "unequivocal" if global cooling of about 0.42 deg C starts soon and continues until about 2030, as shown by the model in Figure 3. The IPCC projection for the GMTA for 2020 is 0.8 deg C, while the prediction from the model is 0.2 deg C, a large discrepancy of 0.6 deg C. If this global cooling is confirmed, it will then be time to bury the theory that CO2, a plant food, causes catastrophic global warming. Fortunately, we don't have to wait too long for the burial: less than ten years. It will be cheering news!

IPCC Projections

According to the IPCC [1], “For the next two decades, a warming of about 0.2°C per decade is projected for a range of SRES emission scenario.”

The IPCC explains this projection as shown in Figure 5, where GMTA trend lines were drawn for four periods, all ending in 2005 and starting in 1856, 1906, 1956 & 1981. These trend lines give an increasing warming rate: 0.045 deg C per decade for the RED trend line (1856 to 2005), 0.074 deg C per decade for the PURPLE trend line (1906 to 2005), 0.128 deg C per decade for the ORANGE trend line (1956 to 2005), and a maximum of 0.177 deg C per decade for the YELLOW trend line (1981 to 2005). The IPCC then concludes, "Note that for shorter recent periods, the slope is greater, indicating accelerated warming" [9].

If this IPCC interpretation is correct, catastrophic global warming is imminent, and it is justified for the world to be gripped by fear of global warming. However, is the IPCC's "accelerated warming" conclusion shown in Figure 5 correct?

The GMTA pattern in Figure 3 shows alternating cooling and warming phases. As a result, in Figure 5, comparing the warming rate of a period that contains only a warming phase with that of a period containing both warming and cooling phases will obviously show a higher warming rate for the first period. This is comparing apples to oranges.

Comparing apples to apples is to compare two periods that have the same number of cooling and/or warming phases.

Figure 5. IPCC global mean temperature trend lines for four periods ending in 2005 [9].

One example of comparing apples to apples is to compare one period that has one warming phase with another that also has one warming phase. From Figure 3, two 30-year periods that have only one warming phase are the periods from 1910 to 1940 and from 1970 to 2000. For the period from 1910 to 1940, the increase in GMTA was 0.13+0.64=0.77 deg C, giving a warming rate of (0.77/30)*10=0.26 deg C per decade. Similarly, for the period from 1970 to 2000, the increase in GMTA was 0.48+0.29=0.77 deg C, giving an identical warming rate of 0.26 deg C per decade. Therefore, there is no “accelerated warming” in the period from 1970 to 2000 compared to the period from 1910 to 1940.

A second example of comparing apples to apples is to compare one period that has one cooling and one warming phase with another that also has one cooling and one warming phase. From Figure 3, two 60-year periods that each have one cooling and one warming phase are the periods from 1880 to 1940 and from 1940 to 2000. For the period from 1880 to 1940, the increase in GMTA was 0.13+0.22=0.35 deg C, giving a warming rate of (0.35/60)*10=0.06 deg C per decade. Similarly, for the period from 1940 to 2000, the increase in GMTA was 0.48-0.13=0.35 deg C, giving an identical warming rate of 0.06 deg C per decade. Therefore, there is no "accelerated warming" in the period from 1940 to 2000 compared to the period from 1880 to 1940.

From the above analysis, IPCC’s conclusion of “accelerated warming” is incorrect, and its graph shown in Figure 5 is an incorrect interpretation of the data.

Based on the observed GMTA pattern shown in Figure 3, a global warming phase lasts for 30 years and is followed by global cooling. As a result, the recent global warming phase that started in the 1970s ended in the 2000s, as shown by the current GMTA plateau, and global cooling should follow. Therefore, IPCC's projection of global warming of 0.2 deg C per decade for the next two decades is incorrect. Also, the divergence between IPCC projections and observed GMTA values has been "discernible" since 2005, as shown in Figure 3.

According to the principle of Occam's razor, given a choice between two explanations, choose the simpler one that requires the fewest assumptions. Instead of applying Occam's razor by assuming the cause of the GMTA turning points to be natural, the IPCC assumed the cause to be man-made [9]:

From about 1940 to 1970 the increasing industrialisation following World War II increased pollution in the Northern Hemisphere, contributing to cooling, and increases in carbon dioxide and other greenhouse gases dominate the observed warming after the mid-1970s.

As in the 1880s & 1910s, what if the causes of the GMTA turning points in the 1940s and 1970s were also natural?

Figure 4, with high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by a combination of a linear and sinusoidal pattern given by Equation 3. This single GMTA pattern that was valid in the period from 1880 to 1940 was also valid in the period from 1940 to 2000 after about 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil. Also, IPCC’s conclusion of “accelerated warming” shown in Figure 5 is incorrect.

What is the cause of the GMTA turning point from warming to plateau in the 2000s? Here is the suggestion by Mike MacCracken [4]:

I think we have been too readily explaining the slow changes over past decade as a result of variability–that explanation is wearing thin. I would just suggest, as a backup to your prediction, that you also do some checking on the sulfate issue, just so you might have a quantified explanation in case the prediction is wrong.

According to the IPCC and the above suggestion, the 1940 GMTA turning point from global warming to cooling was caused by sulfates, the 1970 turning point from cooling to warming was caused by carbon dioxide, and the 2000 turning point from warming to plateau was caused by sulfates. It is interesting to note that sulfates and carbon dioxide gave the globe alternating 30-year cooling and warming phases from 1940 to 2000. This is just absurd.

Instead of saying "Be awkward if we went through a early 1940s type swing!" in private while declaring in public that global warming "is accelerating at a much faster pace", please release the world from the fear of climate catastrophe from the use of fossil fuels, as this catastrophe is not supported by your own data. It is extremely callous not to do so.

Is the theory that "human emission of CO2 causes catastrophic global warming" one of the greatest blunders of "science", or something worse? We will find the unambiguous answer within the next ten years. Let us hope they don't succeed in declaring the plant food a pollutant and taxing us before then.

==========================================

This document is also available as a PDF file, link below:

Predictions Of GMT

For any criticism, please leave a comment below, or contact me at orssengo@lycos.com

Girma J Orssengo

Bachelor of Technology in Mechanical Engineering, University of Calicut, Calicut, India

Master of Applied Science, University of British Columbia, Vancouver, Canada

Doctor of Philosophy, University of New South Wales, Sydney, Australia

===========================================

REFERENCES

[1] IPCC Fourth Assessment Report: Climate Change 2007

“a warming of about 0.2°C per decade is projected”

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/spmsspm-projections-of.html

[2] Observed Global Mean Surface Temperatures (HadCRUT3) from the Climatic Research Unit and the Hadley Centre.

http://www.woodfortrees.org/plot/hadcrut3vgl/compress:12/from:1880/plot/hadcrut3vgl/from:1880/trend/plot/hadcrut3vgl/from:1880/trend/offset:0.3/plot/hadcrut3vgl/from:1880/trend/offset:-0.3

[3] Climate Change Science Compendium 2009

“is accelerating at a much faster pace”

http://www.unep.org/pdf/ccScienceCompendium2009/cc_ScienceCompendium2009_full_en.pdf

[4] Climategate Email from Mike MacCracken to Phil Jones, Folland and Chris

“that explanation is wearing thin”

http://www.eastangliaemails.com/emails.php?eid=947&filename=1231166089.txt

[5] Climategate Email from Mick Kelly to Phil Jones

“Be awkward if we went through a early 1940s type swing!”

http://www.eastangliaemails.com/emails.php?eid=927&filename=1225026120.txt

[6] The Pacific Decadal Oscillation (PDO)

http://jisao.washington.edu/pdo/

[7] Pacific Ocean Showing Signs of Major Shifts in the Climate

http://www.nytimes.com/library/national/science/012000sci-environ-climate.html

[8] Carbon Dioxide Information Analysis Center

Global CO2 Emissions from Fossil-Fuel Burning, Cement Manufacture, and Gas Flaring

http://cdiac.ornl.gov/ftp/ndp030/global.1751_2006.ems

[9] Climate Change 2007: Working Group I: The Physical Science Basis

How are Temperatures on Earth Changing?

http://www.ipcc.ch/publications_and_data/ar4/wg1/en/faq-3-1.html

347 Comments
Graham Jay
April 25, 2010 12:09 pm

Slightly off-topic
From Today’s telegraph:
Dinosaurs died from sudden temperature drop ‘not comet strike’, scientists claim
http://www.telegraph.co.uk/science/dinosaurs/7624014/Dinosaurs-died-from-sudden-temperature-drop-not-comet-strike-scientists-claim.html

Nigel Harris
April 25, 2010 12:14 pm

Bwah ha ha ha ha ha!
(You cannot expect to be taken seriously!)

Lon Hocker
April 25, 2010 12:16 pm

I’m skeptical…

David Ball
April 25, 2010 12:19 pm

When they said ” It is worse than we thought”, it was in regards to their predictions and modelling capabilities.

April 25, 2010 12:19 pm

What most of us thought, but don’t hold your breath waiting for “the media” to pick it up and run with it.

David Ball
April 25, 2010 12:24 pm

The author also attended UBC. Suzuki’s alma mater. Wonder how he/she managed to fly under the radar there? Great article and thank you.

Milwaukee Bob
April 25, 2010 12:26 pm

“Hope they don’t succeed in calling the plant food a pollutant and tax us before then.”
WOW! Now THAT’S a post! It’ll take a while (for me) to sort thru the details and a first glance suggestion would be a “talking points summary” for the less…. well, the media and policy makers.
AND unfortunately I think the (money) train has long ago left the station regarding the last statement (above) in your post. Tooo many BIG WIGs have a lot of $$ invested in CO2 offsets (trades) and the “industry” thereof. If Cap & Trade were NOT to happen, a lot of important people will be ticked! And out bundles!

April 25, 2010 12:29 pm

Does anyone have any current address for the Central England Temperature (CET) series please?
No joke – all the sites (HadMET, CRU, etc) have been down for a while.

Hockeystickler
April 25, 2010 12:32 pm

It looks like natural cycles of about 30 years one way (up or down), 60 years both ways ; perhaps if we can get away from the politics of co2, real scientists can figure out why.

HotRod
April 25, 2010 12:34 pm

Well, I was going to be quite interested in this piece until I read ‘CO2, a plant food, i.e. foundation of life’ early on. er, what relevance to a piece about temperature?

April 25, 2010 12:36 pm

Excellent research and superb exposition. My only worry is as follows:
if the Cru data base is biased upward due to the increasing UHI effect not being corrected for, this research suggests that instead of 30-yr warming then cooling cycles, we are actually heading toward 30-yr cooling then 30-yr increased cooling periods.
Or have I missed something?

April 25, 2010 12:37 pm

Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm
We might expect to see an acceleration in temperature growth over the next century, coming in at the low end of current IPCC estimates and well below Hansen’s 1988 predictions.

April 25, 2010 12:42 pm

Arctic ice concentrations are today the largest in nine years. Arctic ice grew until March 31st, the latest ever recorded. Arctic ice is thicker than in 1980. The northern hemisphere had one of the coldest and snowiest winters on record. Solar activity remains low, we have had nine consecutive days without sunspots. The oceans might soon move towards La Nina. The Katla volcano in Iceland might soon become active.
Yes indeed, I would expect the next ten years will determine if the theory of man made global warming is correct or not. In fact, I would take a wager on this one.

David Mayhew
April 25, 2010 12:44 pm

“Climate Research Unit (CRU) of the Hadley Center”
Surely these are separate institutions ? The CRU is part of the University of East Anglia and the Hadley Centre is part of the UK Meteorological Office. Important IMHO for overall credibility of this article. You may wish to edit it (and then this note is unnecessary and can be removed).

DirkH
April 25, 2010 12:51 pm

“stevengoddard (12:37:38) :
Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm

When you take two time series that both go upwards during the same time interval and do a scatterplot over them they always correlate. As the CO2 goes up about linearly with time since about 1950, we can see in your graphs that the x axis is basically the time axis, at least for the higher CO2 values.
Now take a close look at the HadCRUT3 vs CO2: Do you notice the slight drop in the temperatures near the right end? This is exactly the last 10 “flat” years and might be the beginning of the cooling.
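DirkH's scatterplot point is easy to demonstrate: two series that share nothing but an upward drift still correlate strongly, because time acts as a common "confounder". A minimal sketch with synthetic data (not the actual HadCRUT3 or CO2 series):

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
# Two independent series whose only shared feature is a linear upward drift.
a = [0.01 * t + random.gauss(0, 0.3) for t in range(600)]
b = [0.02 * t + random.gauss(0, 0.3) for t in range(600)]
print(f"correlation of two unrelated trending series: r = {pearson(a, b):.2f}")
```

The correlation comes out close to 1 even though the two noise processes are completely independent; it says nothing about causation.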

Jimbo
April 25, 2010 12:52 pm

Here is an illustration with models V reality and theory being put into serious question, just like AGW.
23 April, 2010 – NASA JPL News Release
“NASA’s Spitzer Space Telescope has discovered something odd about a distant planet — it lacks methane…”
“Models tell us that the carbon in this planet should be in the form of methane. Theorists are going to be quite busy trying to figure this one out.”
and
17 April, 2010 – University of California Santa Barbara
“The discovery of nine new planets challenges the reigning theory of the formation of planets, according to new observations by astronomers. ”
Sources:
http://www.jpl.nasa.gov/news/news.cfm?release=2010-137
http://www.ia.ucsb.edu/pa/display.aspx?pkey=2220
http://www.astrobio.net/pressrelease/3472/a-mystery-of-missing-methane
http://www.astrobio.net/pressrelease/3465/opposite-orbits-upend-theories

Andrew30
April 25, 2010 12:53 pm

geoff pohanka (12:42:36)
“Arctic ice concentrations are today the largest in nine years. ”
And have been that way for about 25 days.
And the area coverage is currently about 1.3 times the area of Canada (2nd largest country on Earth).

Tucker
April 25, 2010 12:54 pm

Wonderful article. Very informative and actually much more believable than the report produced by the IPCC at billions of dollars. The only issue I take exception to is the fact that the author has made a dangerous assumption that past conditions remain the same into the future. That is, the cooling will be at -0.42C per cycle, and the warming at 0.77C per cycle. It seems rather intuitive that those numbers are themselves cyclical, or else we’d find ourselves either in a sauna or a block of ice before long.

janama
April 25, 2010 12:56 pm

Henry Galt (12:29:59) :
you can find it here.
http://www.climate4you.com/

David Ball
April 25, 2010 12:57 pm

stevengoddard (12:37:38) :”We might expect to see an acceleration in temperature growth over the next century, coming in at the low end of current IPCC estimates and well below Hansen’s 1988 predictions”.- Right smack dab in the middle of natural variability. Whodathunkit.

Rod Smith
April 25, 2010 12:58 pm

I don’t suppose we need to “sweat it!”

Jimbo
April 25, 2010 12:58 pm
Hockeystickler
April 25, 2010 12:59 pm

stevengoddard- “Both HadCrut and GISS show a good correlation between CO2 and temperature”- of course they do, but are they anywhere near accurate ?

Peter Miller
April 25, 2010 1:00 pm

As a mathematical model, this makes much more sense than all the alarmist climate models, as none of the latter can be made to fit the known reality of the past.
Nevertheless, it is a mathematical model and nature can be a real bitch.
Of course, it will be ignored or vilified by the Establishment, as it is not supportive of the many billions of dollars of taxpayer funds currently being spent to highlight and ‘prove’ a threat that does not exist.
However, it is another small nail in the huge coffin of bogus AGW theory and therefore we must ensure it has as wide a distribution as possible.

April 25, 2010 1:01 pm

Any bets on whether the progressive increase in the GMTA maxima are, at least partially, the result of UHI?

rbateman
April 25, 2010 1:03 pm

Looks good except that the GMTA takes a turn upward going backwards from 1880 to 1870, and comes out of the upper range of oscillation.
Just as the warm/cool phases of PDO/AMO run in a sine, so does the GMTA.
Climate as seen through GMTA will not stay linear, and it must rock and roll.
Great 1st step, but only a 1st step.
Now, go back to your raw CRU 91/94/99 data and pick out that which was not modified with hot adjusters, then look to identify those overarching rolls. Which way now? You may have to screen out UHI affected sites, and replot as much rural/non-UHI as there is.
Contrast that with UHI affected added back in, and you have man’s footprint:
Land use issues.
Nature abhors straight lines and vacuums.
Line trends will not do for long-term prediction outside of the upslope and downslope of cycles.
Until models can contain equations that account for and utilize rolling sines of many durations, they are but scratches on a canvas.

Hu Duck Xing
April 25, 2010 1:07 pm

Excellent! Clear 30 year cycles.

Invariant
April 25, 2010 1:07 pm

geoff pohanka (12:42:36) : The Katla volcano in Iceland might soon become active.
Right! This decade may be influenced by:
1. transition from positive to negative PDO, AO and NAO
2. transition from El Nino to La Nina
3. transition from strong to weak solar cycles
4. transition from silent to violent volcanos
Still, for temperature, we know the AGW message, The Only Way Is Up!

roger
April 25, 2010 1:08 pm

Henry Galt
they are indeed unobtainable. This has occurred before, and usually happens when results fail to match up to their expectations. As it would now take a phenomenal rise in average temps for the rest of the year to offset the cool figures realised thus far, let alone to produce the new record annual temperature foretold in their fairy books, they have lost interest and are emulating the sun by having a grand sulk in the universe (sic) of Exeter.

UK John
April 25, 2010 1:10 pm

Every future prediction will be wrong; that is the only certainty.

rbateman
April 25, 2010 1:12 pm

Alexander (12:36:19) :
Simply compare the versions of Phil Jones CRU 91, 94 and 99 to see what was changed, what was not. Phil did a lot of work, some of it quite good.
We need to hear from Phil Jones on what went into those set changes.
We also need to hear from sources such as the AMA, which may or may not have some of the records that Phil Jones had access to, but are no longer to be accounted for.

John Cooke
April 25, 2010 1:13 pm

As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
As such, any predictions made with it are purely based on the (relatively recent) past observational history. It makes no attempt to understand the underlying physics.
This lack of a proper understanding of the physics involved (including a lot of stuff that apparently is not included in climate models at present, such as solar activity effects) means that predictions made with it are not particularly helpful. If the parameters change, such as (for example) a change in solar activity that does not fit the pattern of recent years, then the ‘model’ has no way to take account of this.
In this whole field, there’s far too much reliance on making predictions from recent patterns rather than on trying to understand what’s really going on. OK, politicians (and the public) like things to be nice and clean and clear, but sorry, the real world isn’t like that.

Dr T G Watkins
April 25, 2010 1:14 pm

Another 90 mins. of enjoyable reading at WUWT. (It took me that long to follow.)
Go, Girma, go. It seems to be an excellent analysis of the published CRU data and I could follow your maths pretty well although I’m not clever enough to be truly critical. Well done for sharing with the less gifted (me).
I look forward to other comments on this site. I’m sure they’ll be complimentary.

H.R.
April 25, 2010 1:16 pm

“[…] As a result, the statement we often hear from authorities like UN Secretary-General Ban Ki-moon that “climate change is accelerating at a much faster pace than was previously thought by scientists” [3] is incorrect. […]”
Thank you, Girma Orssengo (and Anthony). It is a pleasure to read your guest post.
Not only are you astute, but you are very polite. (I would have been snipped for my assessment of Secretary-General Ban Ki-moon’s statement.)

The Ghost Of Big Jim Cooley
April 25, 2010 1:20 pm

Sorry, I’m not one to usually nitpick, but the Hadley Centre is a noun (proper). So it CANNOT be Hadley Center – you can’t just change it to the American spelling. It is Hadley CenTRE, not Center. Sorry, but we English get a little fed up about the corruption of the language. We accept other countries adapting it, but not when they’re stating something that’s actually here. Hence, the ‘British Tyre Company’ could never be ‘The British Tire Company’.

The Ghost Of Big Jim Cooley
April 25, 2010 1:22 pm

Henry Galt, no, all the Met Office websites are down.

Richard Telford
April 25, 2010 1:23 pm

Your model is very useful. Extrapolating back in time to the early part of the last ice age, your model predicts temperatures below absolute zero. I knew the ice ages were cold, but that is perhaps a tad excessive. Predicting forwards, we can determine how long it will be before the oceans boil.
Your model describes the data passably, but has no predictive power, other than that of predicting your credibility – absolute zero.
Useful models try to incorporate the physics of the system. Numerology is not physics.

April 25, 2010 1:28 pm

As long as you’re using bogus temperature readings to make your hypothesis you will never have a valid hypothesis. 1934 was warmer than 1998, thus the whole idea is bunk and garbage. It seems strange that the skeptics would indulge the alarmist warmists by using garbage data.

WAM
April 25, 2010 1:28 pm

It seems to be an LSQ fit, using a superposition of oscillatory and linear trends. It suggests that the temperature will grow without bounds (so why this discussion after the derivation of the approximations?).
There was a lot of discussion on the blog by Bart Verheggen, where VS has shown that an LSQ fit cannot be used for finding trends in temperature records. Somewhat more complex statistical approximations are suitable for this. And these models are capable of modelling the observed temperature signal (a stochastic fit, without referring to any physics – the same thing as we have with the LSQ fit here).
Note for Steve Goddard: Papers on cointegration methods for CO2 and temperature show that it is rather difficult to find correlation between these two variables, at least in observed data (not results of computer models). (please refer to CA and dr Stockwell blog).

noaaprogrammer
April 25, 2010 1:29 pm

Between the dates of past solar sunspot minima and maxima, what are some of the longest stretches of consecutive days without sunspots?

April 25, 2010 1:29 pm

Surely if one starts the series 100yrs earlier the linear anomaly would be less steep;
http://members.casema.nl/errenwijlens/co2/europe.htm
and the next negative oscillating anomaly would be potentially deeper.

April 25, 2010 1:32 pm

Hockeystickler (12:59:23) :
Over the last 30 years, HadCrut and GISS correlate reasonably well with satellite data. Prior to that their slopes were much lower. So answering your question, I don’t see any reason to believe that the errors in the GISS database are having a huge impact on the graph.
Roy Spencer and other well known skeptical scientists agree that we should see warming close to the non-feedback response of Stefan–Boltzmann
http://en.wikipedia.org/wiki/Stefan-Boltzmann_law
That seems to be pretty close to what is happening.
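As a rough check on the Stefan–Boltzmann point above: linearising F = σT⁴ at the planet's effective emission temperature gives the standard no-feedback response. The forcing (~3.7 W/m² per CO2 doubling) and T ≈ 255 K are textbook values assumed here, not figures from the post:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # Earth's effective emission temperature, K
DF = 3.7          # canonical radiative forcing for doubled CO2, W m^-2

# Linearise F = sigma * T^4: dF/dT = 4 * sigma * T^3,
# so the no-feedback warming is dT = dF / (4 * sigma * T^3).
lambda0 = 1.0 / (4.0 * SIGMA * T_EFF ** 3)  # K per (W m^-2)
dT = lambda0 * DF
print(f"no-feedback sensitivity: {dT:.2f} K per CO2 doubling")
```

This comes out near 1 °C per doubling before any feedbacks, which is the scale of response being discussed.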

April 25, 2010 1:34 pm

Nice piece of work – in fact I made it the second post at my blog (www.aetherczar.com) that I’m kicking off this weekend. I took the liberty of re-posting one of the figures back to my blog to save you any bandwidth fees from my re-post. You don’t appear to have a policy on the subject, so I assume you wouldn’t mind. If you do, let me know and I’ll take any appropriate action.
Watts Up With That has become a regular stop for me, thanks to the high quality and insightful posts like this one.
Hans

Archonix
April 25, 2010 1:37 pm

Richard Telford (13:23:18) :
Your model describes the data passably, but has no predictive power
Apparently it shares this particular attribute with the IPCC models.

Anton
April 25, 2010 1:42 pm

“For the period from 1880 to 1940, the average emission of CO2 was about 0.8 G-ton, and the increase in the GMTA was 0.13+0.22=0.35 deg C. For the period from 1940 to 2000, the average emission of CO2 was about 4 G-ton, but the increase in GMTA was the same 0.48-0.13=0.35 deg C. This means that an increase in CO2 emission by 4/0.8=5-fold has no effect in the increase in the GMTA.”
It’s not a five-fold INCREASE in CO2 levels. It’s a four-fold increase (i.e., the amount ABOVE the original). It’s five times as much (4 divided by 0.8 = 5) but four times MORE than the original: that is, a four-fold INCREASE. 🙂

Al Gored
April 25, 2010 1:43 pm

“As a result, the claim by the IPCC of climate catastrophe is not supported by the data.”
I’m shocked! Speechless! LOL.
But I agree with astonerii that this study is limited by the use of CRU junk data. But if even that skewed data shows this, well…
David Ball (12:24:33) wrote: “The author also attended UBC. Suzuki’s alma mater. Wonder how he/she managed to fly under the radar there?”
Suzuki was recognized as an egotistical blowhard when he was at UBC, and as I’m sure you know he was in genetics – back when that was a very simple science.
Now that he’s a TV eco-evangelist with a Jim-and-Tammy-Faye-like “Foundation” he has more influence, and would no doubt like to throw this author in jail.

Pascvaks
April 25, 2010 1:44 pm

Ref – Alexander (12:36:19) :
“Excellent research and superb exposition. My only worry is as follows: if the Cru data base is biased upward due to the increasing UHI effect not being corrected for, this research suggests that instead of 30-yr warming then cooling cycles, we are actually heading toward 30-yr cooling then 30-yr increased cooling periods. Or have I missed something?”
_____________________________
I don’t think you’ve missed anything at all.

L
April 25, 2010 1:46 pm

Masterful! Also happens to coincide with my own impression of what’s going on, based on a three year regular reading of the articles on WUWT.
More importantly, the math is straightforward, seemingly airtight, and the piece is easy enough for even a caveman to understand. This article is very much needed and needs as wide a dissemination as possible, given that the attention of Joe Sixpack lives by the KISS principle. Congresspersons, too….

April 25, 2010 1:47 pm

stevengoddard (12:37:38) :
“Both HadCrut and GISS show a good correlation between CO2 and temperature.”
Steven: Is this not to be expected, considering that HadCRU and GISS are known to have “correlated” their data; e.g. flattening the warming period in the early part of the last century? We also know that over the past 200 years the rate of increase in atmospheric CO2 has been much slower than both the rise in world population and the resulting increase in man-generated CO2 – why is that? Aye, Bob.

DirkH
April 25, 2010 1:50 pm

“Richard Telford (13:23:18) :
[…]
Useful models try to incorporate the physics of the system. Numerology is not physics.”
But models that try to model the physics yet fail to explain what’s happening in reality are clearly useless.

April 25, 2010 1:52 pm

Richard Telford (13:23:18) :
If the progressive increase in the GMTA maxima are largely UHI driven, the slope of the long term curve flattens dramatically and the Ice Age temperatures are significantly higher. See the astonerii (13:28:08) post immediately following yours.

Massimo PORZIO
April 25, 2010 1:52 pm

astonerii (13:28:08) :
“As long as your using bogus temperature readings to make your hypothesis you will never have a valid hypothesis. 1934 was warmer than 1998, thus the whole idea is bunk and garbage. It seems strange that the skeptics would indulge the alarmist warmists by using garbage data”
I agree, I can’t imagine how to create a “model” using corrupted data.
If the data are wrong, any modeller should stop, and scientists should collect the right data first.

pat
April 25, 2010 2:03 pm

It is always distressing when a politically driven hoax is found out.

April 25, 2010 2:10 pm

Bob(Sceptical Redcoat) (13:47:19) :
There is a possibility that the temperature data sets have been adjusted to match CO2.
Nevertheless, the projections are not far off what one would expect from physics, and support the author’s assertion that we are not headed for a catastrophe. Nothing in the temperature record seems to support that, and in fact the low end IPCC projections don’t support the idea of catastrophe either.
The problem is that the press and government pick up on Hansen’s high end projections.

MartinGAtkins
April 25, 2010 2:11 pm

stevengoddard (12:37:38) :
Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm
Velocity cannot be calculated without time.

Richard Telford
April 25, 2010 2:13 pm

astonerii (13:28:08) :
Let me guess your nationality. American?
You have an overinflated view of the importance of the United States climate. In the US, 1934 was warmer than 1998, though not significantly so. Globally it was substantially cooler. But don’t let reality get in the way of a good rant.

WAM
April 25, 2010 2:14 pm

Dear Author,
People nowadays really do better.
Please read dr Stockwell:
http://landshape.org/enm/best-fit-integrated-model-of-global-temperature/
Your contribution goes along with IPCC “trends”, subject to things like the range of years analysed and the functional form your model employs.
Your modelling is really too simple when one considers that the temperature record is produced by the real dynamics of global atmospheric circulation, forced by solar cycles, seasonal albedo changes, etc. The biggest drawback is that, according to discussions on various blogs, the temperature record contains a so-called unit root, so an LSQ fit is spurious.
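The unit-root objection is the classic "spurious regression" problem: regressing one random walk on another routinely yields a high R² even though the series are independent, whereas unrelated white-noise series do not. A minimal illustration with synthetic data (not the temperature record):

```python
import random

def ols_r2(xs, ys):
    """R-squared of a simple least-squares fit of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def white_noise(n):
    return [random.gauss(0, 1) for _ in range(n)]

def random_walk(n):
    # Cumulative sum of white noise: a unit-root (I(1)) process.
    w = [0.0]
    for _ in range(n - 1):
        w.append(w[-1] + random.gauss(0, 1))
    return w

def mean_r2(gen, trials=200, n=300):
    # Average R-squared over many pairs of independently generated series.
    return sum(ols_r2(gen(n), gen(n)) for _ in range(trials)) / trials

random.seed(1)
print(f"mean R2, unrelated white-noise pairs: {mean_r2(white_noise):.3f}")
print(f"mean R2, unrelated random-walk pairs: {mean_r2(random_walk):.3f}")
```

The random-walk pairs show a far larger average R² despite having no common driver, which is why trend fits to unit-root series need cointegration-style tests rather than plain least squares.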

Ron Pittenger, Heretic
April 25, 2010 2:16 pm

Great article. I hope we get the ten years, but the way to bet is we in the USA have to survive the next 8 months first. Then, with luck and hard work, the next few years. After that, it ought to be fairly obvious, one way or the other. Thanks.

WAM
April 25, 2010 2:17 pm

@ Steve Goddard
It would be worth your while to look at the contributions of dr Stockwell and VS (and some papers – BR) on the cointegration of CO2 and temperature.

Tenuc
April 25, 2010 2:17 pm

@Girma Orssengo:
Thanks for a well written and very interesting article. The 60 year warming/cooling cycle is nicely shown, as is the way you politely rubbished the IPCC ‘accelerated warming trend’ claim. The underlying warming needs an explanation and I suggest you look at the following areas:
Solar activity (TSI/Solar wind speed&density/magnetic field strength).
UHI effect in Hadcrut Data.
Changes in polar ozone concentrations.
Effect of large equatorial volcano eruptions.
Changes in land use.

Invariant
April 25, 2010 2:21 pm

John Cooke (13:13:35): As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
I agree. A curve fit is just a curve fit. A better approach would be to start with the conservation laws and the laws of thermodynamics:
http://en.wikipedia.org/wiki/Conservation_law
http://en.wikipedia.org/wiki/Laws_of_thermodynamics
In addition there are some energy dissipation and entropy production extremal principles:
http://en.wikipedia.org/wiki/Extremal_principles_in_non-equilibrium_thermodynamics
Then remember that temperature as such is not a good measure of the state of our planet. Oh, there are so many forms of energy…
dU = TdS – pdV + µdN

Gerry
April 25, 2010 2:24 pm

OMG! In 10,000 years we’ll be soup! Or not. Maybe. Possibly. Perhaps….

HAS
April 25, 2010 2:33 pm

I do wonder at the wisdom of continuing to publish work that doesn’t first test the data to show the validity of the assumptions underpinning the statistical methods used in the inferences, and second use statistical techniques to test assertions about trends and break points.
This should be all about doing better science (physics and stats) than IPCC, not just producing more of the same (it went up, they put a line through it; it went down, I put a line through it; it went up and down, I put a cosine curve through it).

Huub Bakker
April 25, 2010 2:37 pm

Off topic, but I see that Andrew Weaver of the University of Victoria has decided to sue The National Post for libel.
http://arstechnica.com/science/news/2010/04/climatologist-sues-for-libel-demands-copyright-of-articles.ars
This should be good since it will since sceptics’ arguments will finally be given their day in court.

Huub Bakker
April 25, 2010 2:39 pm

Oops. How about:
“This should be good since sceptics’ arguments will finally be given their day in court.”

Brent Hargreaves
April 25, 2010 2:41 pm

Richard Telford (13:23:18) : “Useful models try to incorporate the physics of the system. Numerology is not physics.”
You’re right, Richard. I am a big fan of WUWT, but this posting by Mr. Orssengo contributes nothing to the Great Debate; it is below the usual high standard.
He had a lengthy battle with a bunch of warmists over on the Deltoid website, and I’m sorry to say that they wiped the floor with him. (Which was a pity – I was rooting for him.)
I have no desire to belittle Mr. Orssengo, but would advise caution in assessing the above posting. We sceptics must apply our scepticism consistently in our quest for truth.

kadaka (KD Knoebel)
April 25, 2010 2:46 pm

From astonerii (13:28:08) :
It seems strange that the skeptics would indulge the alarmist warmists by using garbage data.
Nah. There are two main avenues of attack available against the CAGW proponents:
1. Attack their underlying data and concepts.
2. Attack them with their underlying data and concepts.
#2 proves surprisingly useful. They contradict themselves, in the small theories that build up or build upon CAGW, and in their data (the tree ring divergence problem for example). If you can take their own data and show it doesn’t support their own concepts, go for it. That’s a normal approach in science. And for CAGW, it saves the step of showing the data was garbage to begin with, which can be a rather hard process.
Sure, we can see the records were tampered with and shouldn’t be considered reliable. So what? If their own distorted data ultimately won’t support their concepts, then their own distorted data is also a usable tool to tear those concepts apart. If #2 does the job, then later on we can point out how #1 shows it was all nonsense anyway. Meanwhile they keep using those questionable records to build their theories higher and higher, which naturally provides more and more fodder for #2…

Stephan
April 25, 2010 2:54 pm

OT but I thought it NEVER has rained in the Atacama Desert?
http://wxmaps.org/pix/prec8.html

April 25, 2010 2:55 pm

The Ghost Of Big Jim Cooley (13:20:37) :
Sorry, I’m not one to usually nitpick, but the Hadley Centre is a noun (proper). So it CANNOT be Hadley Center – you can’t just change it to the American spelling. It is Hadley CenTRE, not Center. Sorry, but we English get a little fed up about the corruption of the language.

Ah, yes, “cenTRE,” the French spelling.
😉

Stephen Wilde
April 25, 2010 2:56 pm

“John Cooke (13:13:35) :
As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
As such, any predictions made with it are purely based on the (relatively recent) past observational history. It makes no attempt to understand the underlying physics.”
As regards the underlying physics and the real world mechanisms involved I submit that this useful mathematical exercise would mesh with my New Climate Model quite nicely.

April 25, 2010 2:59 pm

MartinGAtkins (14:11:43) :
dT/dt / dCO2/dt = dT/dCO2

DirkH
April 25, 2010 3:00 pm

“Richard Telford (14:13:19) :
astonerii (13:28:08) :
Let me guess your nationality. American?
You have an overinflated view of the importance of the United States climate. In the US, 1934 was warmer than 1998, though not significantly so. Globally it was substantially cooler. But don’t let reality get in the way of a good rant.

Richard, how much is “substantially” for you?

tonyb
Editor
April 25, 2010 3:00 pm

Henry Galt
Metoffice details as follows.
metoffice.gov.uk
enquiries@metoffice.gov.uk
By post: Met Office
FitzRoy Road
Exeter
Devon
EX1 3PB
United Kingdom
Tonyb

kwik
April 25, 2010 3:06 pm

Isn’t this post missing a plot of how the formula shows the temperature unfolding over, say, the next 10 years? Not to mention the rest of 2010? Or did I miss it?

Dr T G Watkins
April 25, 2010 3:06 pm

Tucker, John Cooke et al.
My understanding of Girma’s post is that he is mathematically analysing the temp. data as produced by CRU (it is easy to get mixed up with Met. Office Hadley, Exeter), making no comment on the data validity, nor on any underlying physics. He is merely pointing out the cyclical nature of their data and using that analysis as a predictive tool.
It will be interesting to see what the next ten years or so bring; I hope I’m still here to see it.

David L. Hagen
April 25, 2010 3:25 pm

See: Don Easterbrook’s AGU paper on potential global cooling for a similar temperature projection based on the PDO cycle. In 2001, Easterbrook predicted temperature would be declining until 2040, then increasing till 2060 and then declining till 2090. See papers at Easterbrook’s home page.

Philip Thomas
April 25, 2010 3:26 pm

Is this just throwing the IPCC models under a bus to keep the Warmist theory alive?

April 25, 2010 3:27 pm

John Cooke (13:13:35) :
“As such, any predictions made with it are purely based on the (relatively recent) past observational history. It makes no attempt to understand the underlying physics.
This lack of a proper understanding of the physics involved (including a lot of stuff that apparently is not included in climate models at present, such as solar activity effects) means that predictions made with it are not particularly helpful. If the parameters change, such as (for example) a change in solar activity that does not fit the pattern of recent years, then the ‘model’ has no way to take account of this.”
Totally. My findings on solar variation did enable forecasts for the recent N.H. cold winters and wet summers (and the El Nino), and can easily hindcast the coldest winters of the last 2 thousand years that we have records for, and follow most monthly anomalies on 351yrs of CET. I would still largely stick to this forecast;
http://landscheidt.auditblogs.com/2008/06/03/the-sunspot-cycle-and-c24/
although my outlook for 2025 to 2038 is of a very warm period. Harris Mann also have this period marked as hot and dry. If it is to be dry, then there would have to be an absence of significant temperature drops during summer months, such as we have had in the last few summers, and no serious temperature rises in winter months.

MinB
April 25, 2010 3:33 pm

Some time ago I read a post contending that complex systems, like weather, are more accurately predicted by mathematical equations than by systems analysis. I guess this is an attempt to prove that. Even if this particular effort has some flaws, I’m interested in seeing the concept tested.

MartinGAtkins
April 25, 2010 3:36 pm

Alexander (12:36:19) :
Excellent research and superb exposition. My only worry is as follows:
if the Cru data base is biased upward due to the increasing UHI effect not being corrected for, this research suggests that instead of 30-yr warming then cooling cycles, we are actually heading toward 30-yr cooling then 30-yr increased cooling periods.

No one is suggesting that all the warming since the 1970s is due to the UHI effect.

For the period from 1910 to 1940, the increase in GMTA was 0.13+0.64=0.77 deg C, giving a warming rate of (0.77/30)*10=0.26 deg C per decade. Similarly, for the period from 1970 to 2000, the increase in GMTA was 0.48+0.29=0.77 deg C, giving an identical warming rate of 0.26 deg C per decade.

The exact match between the two warm periods is only statistically fortuitous. Values of equivalent magnitude are enough in a projection to validate the study provided the difference is not statistically significant.
We have no idea what effect UHI has on the CRU data but values of say .01 or .03 over a thirty year period would not alter the broader scope of the study.
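For readers checking the quoted figures, the rate arithmetic works out as stated; the numbers are taken directly from the post and this is just a sanity check:

```python
# Anomaly figures as quoted from the post (deg C).
rise_early = 0.13 - (-0.64)   # 1910-1940: from -0.64 to +0.13
rise_late = 0.48 - (-0.29)    # 1970-2000: from -0.29 to +0.48
for label, rise in [("1910-1940", rise_early), ("1970-2000", rise_late)]:
    rate = rise / 30 * 10     # deg C per decade over a 30-year phase
    print(f"{label}: {rise:.2f} deg C over 30 yr = {rate:.2f} deg C/decade")
```

Both phases give 0.77 deg C over 30 years, i.e. about 0.26 deg C per decade, as the post claims.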

gogirma
April 25, 2010 3:37 pm

lol. this is great. go girma. however, you should change the model so 2003–2100 is just a mirror image of 1880–2002. you need to make the linear trend a low-frequency triangle wave oscillation.
I look forward to the update.

davidmhoffer
April 25, 2010 3:38 pm

I don’t understand the criticisms of this analysis. Yes itz a mathematical model that does nothing to explain the physics but it has tremendous value. It clearly shows that there is a 60 year climate cycle superimposed on a gradually rising trend that for the time being I will assume is part of a larger cycle or cycles too long in time span to be visible in this short a data set. The point is that NEITHER correlate to CO2 theory. So the question becomes, what the heck DO they correlate to and what are the physics of THOSE processes?
BTW my 1974 Encyclopaedia Britannica (which is an excellent source of climate information uncontaminated by ridiculous politics) talks about the data in terms of 60 to 75 year cycles. Too bad we didn’t spend the last 40 years figuring out what their driving factors were instead of trying to tax them. Instead we “forgot” that this is not new, and have to rediscover it again. Like the earth was flat, then the Greeks said no itz round, and then it was flat again for a few hundred years and now itz round again.

April 25, 2010 3:55 pm

Henry Galt (12:29:59) : You asked, “Does anyone have any current address for the Central England Temperature (CET) series please?”
Daily CET temps from 1772 are available through the KNMI Climate Explorer and they look current:
http://climexp.knmi.nl/getindices.cgi?UKMOData/daily_cet+Central_England_Temperature+t+someone@somewhere+366
And the monthly CET data from 1659 is also available there:
http://climexp.knmi.nl/getindices.cgi?UKMOData/cet+Central_England_Temperature+t+someone@somewhere

April 25, 2010 3:57 pm

The model is similar to Syun-Ichi Akasofu’s projection (see: http://www.appinsys.com/GlobalWarming/GW_TemperatureProjections.htm)
While strictly mathematical models that do not contain representations of the underlying causative factors have limited value, Hansen and IPCC models that misunderstand the causative factors are no better. Their models basically come down to GHG as the only significant forcing (see: http://www.appinsys.com/GlobalWarming/HansenModel.htm)
The mathematical models that simply reproduce the 60-year cycle with an upwards trend don’t include the underlying longer-term cycle since we don’t have observations from a sufficiently long time frame to know what the cycle length may be. (Or even what causes these cycles.)
My analogy is this: we have been measuring temperatures for two days (each day is the 60-year cycle). Today was warmer than yesterday (because it is spring). We haven’t been measuring long enough to know how long until summer (or what the length of the “annual” cycle is).
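[Editorial illustration: the “linear trend plus 60-year cycle” form discussed in these comments can be estimated by ordinary least squares. This is a minimal sketch on synthetic data, not the author’s exact formula or coefficients; the functional form, the noise level, and the parameter values are assumptions for illustration.]

```python
# Fit a linear trend plus a 60-year sinusoid to a synthetic anomaly series.
import numpy as np
from scipy.optimize import curve_fit

def trend_plus_cycle(t, a, b, amp, phase):
    """Linear warming trend plus a 60-year sinusoidal oscillation."""
    return a + b * t + amp * np.sin(2 * np.pi * (t - phase) / 60.0)

# Synthetic "anomaly" series with a known trend, cycle, and noise.
rng = np.random.default_rng(0)
years = np.arange(1880, 2011)
truth = trend_plus_cycle(years, -10.0, 0.005, 0.2, 1910.0)
observed = truth + rng.normal(0, 0.05, size=years.size)

# Recover the parameters by nonlinear least squares.
params, _ = curve_fit(trend_plus_cycle, years, observed,
                      p0=[-10.0, 0.005, 0.2, 1900.0])
a, b, amp, phase = params
print(f"trend: {b:.4f} deg C/yr, cycle amplitude: {amp:.3f} deg C")
```

Because the data are generated from the same form being fitted, the recovery is almost exact here; that is precisely the caveat several commenters raise below about goodness of fit versus physical meaning.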

Richard M
April 25, 2010 3:58 pm

This type of analysis which is based on past history is only valid for a short time in the future. Maybe 30 years at most. As a result few AGW followers will take this seriously.
It’s like looking at the stock market history for any short period of time. Quite often you can predict the future but eventually something not incorporated in the model comes into play and everything changes.

April 25, 2010 3:59 pm

roger (13:08:57) : Regarding your reply to Henry Galt, “they are indeed unobtainable,” scroll up a comment or two to my reply to Henry. The data is available through KNMI.

April 25, 2010 4:03 pm

astonerii (13:28:08) : You wrote, “1934 was warmer than 1998…”
Are you discussing U.S. temperatures or global temperatures? If global, are you discussing land or sea surface temperatures?

StarBP
April 25, 2010 4:05 pm

The model predicts a steadily increasing temperature. However, that is NOT true. There is a roughly 178-year cycle superimposed on top of a larger 356-year cycle for temperature, based upon the solar cycles. (Some even suppose a ~1424-year cycle, but temperature records before the 1600s are virtually nonexistent.) These cycles both contribute to a slow warming for roughly 80-90% of the time, followed by a steep drop (4-9X the rate of the rise) during the other 10-20% of the time. Currently both cycles are near the top. Numerous predictions of the solar cycle agree that a solar minimum with a strength between the Dalton and Maunder minima is coming, starting in Cycle 24 (now) and strengthening in Cycle 25. Livingston and Penn provide a mechanism for the original Maunder Minimum, and also an extrapolation that produces the same characteristics within 10-15 years. Should a 1424-year cycle exist, it is also due to go down fast during the same time. Oh, and don’t even get me started about the Milankovitch cycles.

INGSOC
April 25, 2010 4:06 pm

Thank you Dr Orssengo. It will take a bit more time to review the references, but this is a thorough and concise presentation that adheres to fact rather than fiction. Devilishly delightful that you thought to credit HADCRU for the data! Truly a refreshing read!
Here’s to many more such truths, made so clear and plain. It’s Occam approved!
Cheers!

R. de Haan
April 25, 2010 4:07 pm

MUST READ: The Greenhouse Effect: Origins, Falsification, & Replacement by Timothy Casey B.Sc. (Hons.)
Sunday, April 25th 2010, 4:58 PM EDT
Co2sceptic (Site Admin)
This is a MAJOR paper that Hans Schreuder informed Alan Siddons about today
A few choice plums:
#Everyone knows what the greenhouse effect is. Well … do they? Ask someone to explain how the greenhouse effect works. There is an extremely high probability that they have no idea.
#Beware of wheels within energy diagrams as these usually constitute the energy creation mechanism of perpetual motion machines. One such gem of clarity, used uncited by Plimer (2009, p. 370), was offered by Kiehl and Trenberth…
#The mechanism by which the addition of carbon dioxide warms the atmosphere has no empirical basis. Therefore the assertion that global warming is anthropogenic, may well be philosophical and perhaps political, but it is most certainly not scientific.
#Increasing visible radiation, even by quite a large amount, results in no measurable rise in temperature because no appreciable amount of visible radiation is converted into infrared when absorbed and re-emitted – contrary to Arrhenius’ hypothesis.
#Tyndall’s confusion of absorption and opacity is a major error that was propagated into Arrhenius’ Greenhouse hypothesis, and constitutes a fact not accounted for in Arrhenius’ calculation of “Climate Sensitivity” to carbon dioxide.
#Although the greenhouse effect died with the Wood experiment, the diverse multitude of radiation “budgets” shows that the greenhouse effect is far from buried. This is a classic case of shifting the goalposts, because the greenhouse effect is not a scientific hypothesis that can be buried when it dies from experimental causes; it is a political symbol that cannot be allowed a proper burial, and so remains forever on display at the funeral parlor; an eternal viewing just like Lenin’s.
By the way, he’s an Aussie.
Alan S
Download the PDF here: http://climaterealists.com/attachments/ftp/The Greenhouse Effect Origins Falsification Replacement by Timothy Casey3.pdf

u.k.(us)
April 25, 2010 4:08 pm

Another study that blows up the IPCC models; what’s not to like?
It’s not making predictions.
It seems to validate climate variability.
Al Gore et al. are running out of time; we are now on the downslope of the temperature curve.

April 25, 2010 4:09 pm

The ~60 year cycle has been found in temperature proxies extending back 1000-1500 years:
http://hockeyschtick.blogspot.com/2010/01/fourier-analysis-of-climate.html

Steven mosher
April 25, 2010 4:12 pm

Numerology.
You fit the “model” to the data and the model “fits” the data. The problem is the model has no physical meaning. Fitting a straight line to a cloud of points is not the same thing as finding the trend; that is merely finding a line that minimizes the error. Estimating a trend is more complicated than that; see the thread on Bart’s blog where VS explains.
The other problem is your out-of-sample performance: if your model is a “physical” model rather than merely a ‘curve fit’, then we can use it to hindcast.
A simple test: hindcast the GMTA for 1850 (HadCRUT3 has that; it is decidedly warmer than your model hindcasts). Or hindcast back to the MWP; bad idea there. Or build your “model” with just part of the data (from 1850 to 1950)
and see how you do on 1950-2010. Or build it from 1940 to 2000, then see how well it works for 2000-2010 and for 1850-1940.
My sense is that you are a long way worse than the typical GCM. That’s not an endorsement of GCMs, but they have more skill than your “model.”
http://climateaudit.org/2008/05/09/giss-model-e-data/
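[Editorial illustration: the out-of-sample test proposed above is easy to mechanize: fit on 1850-1950 only, then score the fit on the held-out 1950-2010 span. Synthetic data stands in for HadCRUT3 here; the point is the procedure, not the numbers.]

```python
# Train/holdout split test for a trend-plus-cycle curve fit.
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, b, amp, phase):
    return a + b * t + amp * np.sin(2 * np.pi * (t - phase) / 60.0)

# Synthetic stand-in series: trend + 60-year cycle + noise.
rng = np.random.default_rng(1)
years = np.arange(1850, 2011)
series = (0.004 * (years - 1850) - 0.3
          + 0.15 * np.sin(2 * np.pi * years / 60.0)
          + rng.normal(0, 0.08, size=years.size))

# Fit on the pre-1950 data only.
train = years < 1950
p, _ = curve_fit(model, years[train], series[train],
                 p0=[-0.3, 0.004, 0.15, 0.0])

# Compare in-sample fit quality with the held-out hindcast/forecast span.
in_rmse = np.sqrt(np.mean((model(years[train], *p) - series[train]) ** 2))
out_rmse = np.sqrt(np.mean((model(years[~train], *p) - series[~train]) ** 2))
print(f"in-sample RMSE {in_rmse:.3f}, out-of-sample RMSE {out_rmse:.3f}")
```

On synthetic data generated from the same form, the out-of-sample error stays near the noise floor; on real data, a large jump between the two RMSE values is the symptom of overfitting that this comment describes.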

pat
April 25, 2010 4:14 pm

25 April: Daily Mail: The ash cloud that never was: How volcanic plume over UK was only a twentieth of safe-flying limit and blunders led to ban
By David Rose, Matt Sandy and Simon Mcgee
The Mail on Sunday can today reveal the full extent of the shambles behind the great airspace shutdown that cost the airlines £1.3 billion and left 150,000 Britons stranded – all for a supposed volcanic ash cloud that for most of the five-day flights ban was so thin it was invisible.
As the satellite images of the so-called ‘aerosol index’ published for the first time, right, demonstrate, the sky above Britain was totally clear of ash from Iceland’s Eyjafjallajoekull volcano…
Attempts to measure the ash’s density were hampered because the main aircraft used by the Meteorological Office for this purpose had been grounded as it was due to be repainted.
Computers at the Met Office, which earlier forecast a ‘barbecue summer’ last year and a mild winter for this year, produced a stream of maps predicting the ash would cover a vast area, eventually stretching from Russia to Newfoundland. But across almost all of it, there was virtually no ash at all, and none visible to satellites….
As the Met Office is responsible for forecasting ash for Europe, air traffic controllers across the continent soon followed the UK lead, closing down aviation…
Unfortunately, the Met Office’s main research plane, a BAE 146 jet, had been stripped of its gear ahead of a paint job, so could not fly until last Tuesday – the last day of the ban..
http://www.dailymail.co.uk/news/article-1268615/The-ash-cloud-How-volcanic-plume-UK-twentieth-safe-flying-limit-blunders-led-lock-down.html

Charles Higley
April 25, 2010 4:16 pm

A wonderful article which brings it all together.
However, asking them to release the world from worrying about global warming is to ignore the fact that this is a global political scam with the goal to alter the world’s power structure, make governments in control of all energy and, thus, every aspect of our lives and finances, and form a one world government (the UN has high aspirations).
They have no intention of informing the public that there is no problem. They are the problem and only we can educate the public by spreading the real science of our climate as widely and loudly as possible.
This is a propaganda campaign to confuse the public enough to allow the passage of falsely based legislation to further their agenda.
Why do you think there has been a move to make conspiracy theory and theorists illegal? The global warming/climate change scam is one huge conspiracy!

pat
April 25, 2010 4:25 pm

our daily dose of alarmism, already being covered by ABC australia:
25 April: Nature: Richard A. Love: An oceanic ‘fast-lane’ for climate change
But estimates of its speed, taken as “snapshots” by instruments deployed from research vessels, had been “all over the place”, says Steve Rintoul, a physical oceanographer at the Antarctic Climate and Ecosystem Cooperative Research Centre in Hobart, Australia, and a co-author of the new study1.
Yasushi Fukamachi, an ocean scientist at Hokkaido University in Sapporo, Japan, led a team effort to determine the exact nature of the current…
This is significant because it represents a “fast lane” by which climatic and environmental changes affecting the Southern Ocean can propagate northward, says Alejandro Orsi, a physical oceanographer at Texas A & M University in College Station, who was not involved in the study..
Understanding such currents could help scientists to predict how the world will react to increasing levels of carbon dioxide, says Richard Alley, a geoscientist at Pennsylvania State University in University Park..
But the currents could change. “We’re not saying this could happen instantaneously, like the movie The Day After Tomorrow,” Fukamachi says, “but understanding this kind of current is very important to understanding global climate.”…
http://www.nature.com/news/2010/100425/full/news.2010.201.html?s=news_rss

Daniel H
April 25, 2010 4:28 pm

Since you asked for criticism…

“Fossil fuels allowed man to live his life as a proud human, but the IPCC asserts its use causes catastrophic global warming.”

This is silly. No one consciously stands at the gas pump thinking: “Gee, I’m so proud of how far civilization has come thanks to these fossil fuels that I’m pumping into the gas tank of my modern internal combustion engine.” More likely, they’re thinking: “Hmmm, I wonder if those doughnuts on sale at the Quicky Mart are still fresh?” As such, I would suggest the following minor revisions to your sentence (in italics):

Fossil fuels have allowed humankind to become more productive and achieve the highest standard of living in recorded history, yet the IPCC asserts that their continued use will cause catastrophic global warming.

Aside from that, the only other criticism I have about your essay is that it reads more like a compendium of user comments stitched together from the pages of WUWT than anything resembling a scientific critique or rebuttal of CAGW. For example, you cannot simply cite an arbitrary comment from the Climategate emails and claim that it conclusively proves or refutes anything about the “science” of global warming. Other people smarter than you have already tried that and have been rebuffed based on legitimate arguments involving context and certain other technicalities related to the casual and private nature of the discussions. In other words, it’s not that simple.
Nice try though.

jorgekafkazar
April 25, 2010 4:32 pm

An interesting post. Within its limitations, it shows that the IPCC alarmograph is deceptive.

John Finn
April 25, 2010 4:35 pm

David L. Hagen (15:25:54) :
See: Don Easterbrook’s AGU paper on potential global cooling for a similar temperature projection based on the PDO cycle. In 2001, Easterbrook predicted temperature would be declining until 2040, then increasing till 2060 and then declining till 2090. See papers at Easterbrook’s home page.

Perhaps it would be useful to validate Easterbrook’s projections now that the first decade is complete (Easterbrook predicted cooling from 2000).

Editor
April 25, 2010 4:36 pm

Politely, as a long-time proponent of cyclical behavior in climate outcomes, I must mildly disagree with a few points and assumptions made by the author, and very strongly disagree with a few of the criticisms.
1. The short 30-year cycle is bounding (enclosing) almost all of the 30-year cyclical data points. The 30-year mathematical curve should instead have a smaller coefficient such that it predicts the average trend of the year-to-year curve, rather than being higher than the max points and lower than the min points. The equation period appears correct, though, and explains much of the current propaganda (“This decade is the hottest ever”, etc.) about the climate.
2. The linear term should be replaced with an 800-year periodic cycle rising from the depths of the 1600s Little Ice Age, peaking at the 1150s Medieval Warm Period, dropping again in the Dark Ages, and peaking again earlier at the Roman Warm Period. This change will NOT change the trend in the next 60 years, but will show that the next 200 years will begin a “significant” long-term decline.
3. Plotting the 60-year cycle on top of the longer 800-year cycle will allow the previous discrepancies in the past 1200 years of data to be resolved: a temperature proxy at ANY given year that “now” appears significantly “off” from other studies that use a date only 30 to 40 years different will, after being plotted on a cyclical trend line, either fall into place or allow a correction to the short-range cycle.
Perhaps 62 or 66 or 72 years will prove more correct. Perhaps (more likely, even!) the short-term cycle is not “perfectly periodic” but reflects a dynamic whose PERIOD itself changes slowly over time.
4. Several readers have condemned this paper because it does not have a theory of “why” there is a period, but instead merely claims that there IS a short-term (and, as suggested above, a long-term) period in climate temperatures.
Fine. So be it. (For now.) When continental drift was proposed as a theory, there was NO theory (or even a practical mechanism) to explain WHY the continents moved; the theory began with the simple and elegant observation that they DID move (in the past). Later, earthquake patterns became clearer, and undersea vents and ridges were discovered, though at the time of the theory there was NO WAY to anticipate either line of research: no way to predict satellite radars, undersea mounts, etc.
When the Periodic Table was laid out, there was no theory about WHY the elements exhibited their patterns, merely that EACH element FOLLOWED a specific repeating pattern, and that PATTERN was enough to predict the behavior of future (entirely unknown) elements.
Copernicus, Brahe, and others could use simple circles for orbits, and such orbits were “good enough” to serve as new models of the solar system, though the “theory of gravity” was NOT available to explain “why” a planet could orbit a distant object. Refinements were needed, just as elliptical orbits proved more accurate than the ancient Greek plots.
Be humble: recognize that the Greeks were more accurate in predicting motion than the “perfect circles” that attempted to replace them. It took many years for the more elegant, more accurate elliptical paths to come forth.
But even a “simple mathematical model” IS sufficient to show that a complex CO2-based fabrication of radiation feedbacks and artificial feedbacks is “false.”

April 25, 2010 4:40 pm

I have used this curve-fitting technique on CO2, ice core, sea ice, and SST data (http://www.kidswincom.net/climate.pdf), and your single cycle superimposed on a straight line is too simple to be used as a predictor. The data show multiple superimposed cycles of different wavelengths, and the linear fit is more likely a segment of a long-term cycle. A cycle common among the various data sets has a wavelength of around 325 years. An 1100-year cycle could explain the MWP and LIA. These are natural cycles that CO2 follows rather than causes.

Editor
April 25, 2010 4:47 pm

Others have already commented along these lines, but WTH – if enough comments like these are added we could create a new ‘consensus’…..
This mathematical model just shows that there has been a 60-year cycle for the last 100+ years, that the 20th century warming has been overstated (and hence that AGW has been overstated, because AGW is based on the amount of warming in the 20thC), and that the IPCC climate models are unsafe (because they do not recognise and cannot explain the cycles).
But that’s about all.
Without an underlying mechanism, we have no way of knowing whether the 60-year cycle will continue in the same pattern. This new model therefore cannot be used for prediction.
Please note that the IPCC (Trenberth) has stated that its models cannot be used for prediction. The fact that the IPCC models have been ruthlessly milked for every possible prediction under the sun does not imply that this new model can be used for prediction.
What should happen – what should have started happening long ago – is that all model predictions should be scrapped, all political activity based on such predictions should cease, and scientists should get on with real research into the mechanisms of climate until the point is reached that we really do have understanding of how climate works.

April 25, 2010 4:49 pm

John Cooke (13:13:35) :
As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
I agree, this is just curve fitting a posteriori. But so is so much that goes for ‘science’ these days.

April 25, 2010 4:53 pm

Richard M (15:58:00) :
I tried out a stock-fitting program which very accurately developed a fifth-order equation for modeling past behaviour, but had no skill at predicting future behaviour.
“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk.”
John von Neumann
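[Editorial illustration: von Neumann’s elephant in miniature. A fifth-order polynomial fitted to a short, noisy “price” history reproduces the past closely, yet its extrapolation is dominated by the highest-order term. The random-walk data are synthetic and purely illustrative.]

```python
# Overfitting demo: fit a degree-5 polynomial to a synthetic random walk.
import numpy as np

rng = np.random.default_rng(42)
days = np.arange(30)
prices = 100 + np.cumsum(rng.normal(0, 1, size=days.size))

coeffs = np.polyfit(days, prices, 5)          # fit the past very closely
fit_err = np.max(np.abs(np.polyval(coeffs, days) - prices))
forecast = np.polyval(coeffs, 60)             # extrapolate 30 "days" ahead

print(f"max in-sample error: {fit_err:.2f}")
print(f"day-60 extrapolation: {forecast:.1f} (last real price {prices[-1]:.1f})")
```

The in-sample residuals are small, but at day 60 the t^5 term is about 25 times larger than at day 30, so the extrapolation reflects the fitted coefficients rather than anything about the process; this is exactly the “no skill at predicting future behaviour” that the stock-fitting program showed.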

thelastpost
April 25, 2010 4:58 pm

The fitting to recent climate fluctuation presented by Girma Orssengo is in a way analogous to parts of Nicola Scafetta’s recent posting where he fitted the recent solar cycle (sunspot) wavetrain of 4 cycles to a historic wavetrain (at one of the previous solar minima) showing close similarity. OK no physical model, but if you can demonstrate enough oscillatory cycles, then characterising them and projecting forward is of some interest.

Bill Illis
April 25, 2010 4:59 pm

If one is looking for the driver of this 60 year cycle, it is most likely the Atlantic Multidecadal Oscillation (AMO).
Here is the AMO and ln (CO2) [for the upward trend] against Hadcrut3 back to 1854 on a monthly basis.
It is hard to miss the correlation (throw in some ENSO variability as well and one gets pretty close).
http://img121.imageshack.us/img121/9162/hadcrut3amoco2t.png
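[Editorial illustration: the correlation Bill Illis describes can be checked with an ordinary least-squares regression of temperature on an AMO-like index plus ln(CO2). Synthetic stand-in series are used below; anyone reproducing his plot would substitute the real AMO, HadCRUT3, and CO2 records.]

```python
# Regress a synthetic temperature series on an AMO-like index and ln(CO2).
import numpy as np

rng = np.random.default_rng(7)
n = 150                                        # "years" from 1854 onward
amo = np.sin(2 * np.pi * np.arange(n) / 65) + rng.normal(0, 0.2, n)
co2 = 290 * np.exp(0.0015 * np.arange(n))      # slow exponential rise
temp = 0.3 * amo + 2.0 * np.log(co2) - 11.3 + rng.normal(0, 0.05, n)

# Design matrix: [AMO, ln(CO2), intercept], solved by least squares.
X = np.column_stack([amo, np.log(co2), np.ones(n)])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
pred = X @ beta
r = np.corrcoef(pred, temp)[0, 1]
print(f"coefficients: {beta.round(2)}, correlation r = {r:.3f}")
```

Note the caveat that applies equally here and to the post’s model: a high r on data constructed (or selected) to contain these components demonstrates the fitting procedure, not the physics.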

April 25, 2010 5:04 pm

Wow. I need to read this over a couple of times to make intelligent comments. However, we must, I think, be a little careful with our language. This quote is an example: “This conclusively proves that the effect of 20th century human emission of CO2 on global mean temperature is nil.” No, it proves nothing; it does falsify the hypothesis that the global temperature increase during these periods was caused by increases in CO2. Remember, please: we do not prove things, we falsify them. Only the Sophists and Propagandists claim to prove things, which logically cannot be done in science.
My wife is calling me to domestic duty….more on science later.

Bruce of Newcastle
April 25, 2010 5:05 pm

Similar study with a 65.7 year peak to peak sinusoidal curve fit to HadCRUT since 1850 here:
http://digitaldiatribes.wordpress.com/2009/02/10/deconstructing-the-hadcrut-data/
However the solar cycle 24 story may see the anomaly maundering off in a Dalton direction for another 1 C or more drop by 2030:
http://www.warwickhughes.com/agri/Solar_Arch_NY_Mar2_08.pdf
Plenty of grist in this paper to add a solar cycle periodicity term to the sinusoidal model.

MartinGAtkins
April 25, 2010 5:06 pm

Richard Telford (13:23:18) :
Your model is very useful. Extrapolating back in time to the early part of the last ice age, your model predicts temperatures below absolute zero. I knew the ice ages were cold, but that is perhaps a tad excessive. Predicting forwards, we can determine how long it will be before the oceans boil.
It’s true that if we know the physical system and its drivers, we can model and perhaps predict outcomes. However, when the system is chaotic and no hypothesis can explain an observed phenomenon, it’s reasonable to assume the phenomenon will remain intact.
There is no convincing hypothesis that explains the warming since the Little Ice Age. They all have flaws or lack observable confirmation. Faced with no identifiable cause or effect, the trend remains a continuum.
The study takes the past and projects its apparent pattern into the future, and is valid while no conditions change. Of course, conditions are and do change.
The model then is used to identify and quantify any event that may cause observations to deviate from the model’s projected path.
Of course, the phenomenon may be just a product of our own desire to find meaningful patterns in random events.

Buddenbrook
April 25, 2010 5:22 pm

This article seems to be ideologically motivated nonsense. It is based on no physics. To expect such short cycles to give us predictive value is a weak argument. Look at the years 800-2000: do the cycles apply? NO. And on top of that, the mathematical model (or whatever it should be called) is based on the very bogus “consensus” temperature data that has come under criticism in the skeptic blogosphere. People who have praised the article seem to do so only because it gives them the right, i.e. desired, pre-conceived answers. This is called confirmation bias and has got nothing to do with valid science. In fact, this article is an example of the type of corrupted post-modern science, à la Mike Hulme, in which pre-conceived answers are written into the process and methods, which then in effect merely posture as science.
Posturing here in vain, I must add, as most of the skeptics here seem to be critical of this article. And this is positive.
The OP shows his motives and colours with comments like these:
“Fossil fuels allowed man to live his life as a proud human, but the IPCC asserts its use causes catastrophic global warming.”
(What has that got to do with the model???) We should be especially proud of the two (I and II) 20th century ‘fossil fuelled’ global spectacles.
“the extremely irrational concept that the gas CO2, a plant food, i.e. foundation of life, is a pollutant because it causes catastrophic global warming.”
There are many things that are foundations of good things but in excess would be very harmful. So the concept isn’t irrational; in fact, it is a plausible hypothesis. To claim it irrational is irrational. It’s just that the catastrophic scenarios built on that hypothesis do not seem to be strongly supported by the DATA, nor by any considerable body of transparent, high-quality research. But the hypothesis in itself is plausible.
And you can’t deduce the weakness of that hypothesis from any “general” principles like “CO2 plays a part in many good things and therefore cannot do anything harmful to life on earth”. That is so Aristotelian. Get with the times; the Newtonian revolution didn’t happen yesterday.
If Michael Mann or Gavin Schmidt had written an article like this but arrived at different results predicting catastrophic warming, ‘a McIntyre’ would tear it into a thousand shreds.
And if they added between the lines “mother earth” type poetry to support their arguments, the equivalent of “CO2, the foundation of life”, you would see no end to the laughter.
Skeptics should not fall victim to the confirmation bias that is plaguing the ‘warmist research’ and jump in to support anything as long as it gives the desired answers, then praise it as good science. That would be embarrassing and counterproductive in the end.

J.Hansford
April 25, 2010 5:43 pm

stevengoddard (12:37:38) :
Both HadCrut and GISS show a good correlation between CO2 and temperature.
———————————————————
That’s because HadCrut and GISS are designed to have a correlation….. It’s Claytons science. The science you do when you aren’t doing science;-)

April 25, 2010 5:48 pm

Mike McMillan (14:55:02) :

Sorry, but we English get a little fed up about the corruption of the language.

Yeah, I used to get fed up too. Then I discovered that a great deal of US spelling, and even pronunciation, is merely ‘old’ English. What seems to have happened is that the UK has moved on, with a strong European influence, while the US has remained steadfast. They still use ‘our’ imperial measurements, too.
So the US has not corrupted ‘our’ language; they have just kept and refined ‘their own’ language. ‘Our’ language has changed in different ways.
Vive la différence!

April 25, 2010 5:50 pm

Sorry, last post should have been addressed to The Ghost Of Big Jim Cooley (13:20:37) :

MartinGAtkins
April 25, 2010 5:52 pm

stevengoddard (14:59:59) :
dT/dt / dCO2/dt = dT/dCO2
dt has no value. Data bunching is indicated.
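[Editorial note: the identity quoted above is just the chain rule: dividing the two time derivatives recovers the derivative of T with respect to CO2, provided CO2(t) is monotonic. A quick numerical check on smooth, hypothetical example functions (the exponential CO2 path and logarithmic T(CO2) below are assumptions for illustration only):]

```python
# Numerical check that (dT/dt) / (dCO2/dt) equals dT/dCO2.
import numpy as np

t = np.linspace(0, 10, 100001)
co2 = 280 * np.exp(0.02 * t)          # hypothetical monotonic CO2(t)
T = 0.5 * np.log(co2 / 280)           # hypothetical T as a function of CO2

dT_dt = np.gradient(T, t)             # numerical time derivatives
dco2_dt = np.gradient(co2, t)
lhs = dT_dt / dco2_dt                 # (dT/dt) / (dCO2/dt)
rhs = 0.5 / co2                       # analytic dT/dCO2 for this T(CO2)

print(f"max abs difference: {np.max(np.abs(lhs - rhs)):.2e}")
```

The two sides agree to numerical precision; the practical difficulty with real data, which this comment hints at, is that noisy derivatives and near-zero values of dCO2/dt make the ratio unstable.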

BarryW
April 25, 2010 6:02 pm

While the equation does not explain why it is happening, it does point to a probable “cyclic” phenomenon that is not addressed by the GCMs. A number of people (myself included, when I played with the temperature data and R) have seen this in the data. The linear rates espoused by the CAGW proponents are most likely too high. This is obvious from the 21st-century divergence from their projections. If the cyclic nature of the temperatures has any validity, we should see a lowering of temperatures on the order of 0.3 or 0.4 deg over the next 20 years before they start rising again.

AC Adelaide
April 25, 2010 6:11 pm

Nice comment, RACookPE1978. Clearly this model shows the longer-term trend is not linked to CO2 and presents a more realistic two-part breakdown of the trend. Great start. Now it is time, as you suggest, to expand the time horizon and see how that trend works backwards, and secondly to work out the “why” of what causes these two underlying trends. OK, PDO is a start. The mathematical model gives us an insight as to where we might be going even if we don’t know the “why”. The theory of “what” is the first important step in any theory of “why”. The problem with the AGWers was that they started with a theory of “why” (rising CO2) and allowed that to drive their mathematical modelling, which then forced them into data tampering to make their model work. They are not going to abandon their “why” until they are made to focus on the “what”. This simple model puts it front and centre for them to explain in the context of rising CO2.

Richard M
April 25, 2010 6:23 pm

stevengoddard (16:53:01) :
I tried out a stock fitting program which very accurately developed a fifth order equation for modeling past behaviour, but had no skill at predicting future behaviour.
Oh, it probably did. It’s just that the time scale was so small that it was not useful. A week of stock market trading is probably equivalent to 100 years of climate (or more).

April 25, 2010 6:32 pm

John Cooke (13:13:35) : – Exactly correct.
Dr. Orssengo,
First, I commend you for this undertaking, and to Anthony for posting this, and offer some comments based on my experience in similar modeling efforts, though in a different field.
I have approximately 25 years of experience with both correlation models and first-principles models, built from complex heat- and material-balance plus kinetic equations, for petroleum refining and petrochemical processes.
First principles models begin with known physics, and when performed properly, can be used to extrapolate into the future. These must include all variables, and correct equations for their behavior, which the GCMs clearly do not. Hence the GCMs are wrong at this time, and will remain wrong until they comply with the fundamentals.
Correlation models (as this one) take existing data, fit a curve to it, and can have pretty good fit. But these are often quite useless as predictors of future behavior, exactly as John Cooke stated at 13:13:35.
Model builders in refineries and petrochemical plants have known these basic truths for decades. They were forced to use correlation models for many years, until the first principles models finally became available and robust.
There are several unknowns that could influence the future GMTA, in particular more or fewer volcanoes, or volcanoes with greater or lesser amounts of particulates and/or sulfates, greater or lesser cloud coverage (no matter the origin, cosmic rays or otherwise), to name only two.
Also, if I may, Dr. Orssengo, there appears to be a better correlation model for the GMTA historical data. As shown in your Figure 4, the linear fit line does not pass through data points (0,0), (0.2,0.2), (0.4,0.4), etc. This indicates that the correlation model will fail, with greater and greater error as one predicts farther into the future.
A brief inspection of Figure 4 shows that there appears to be some error in the data points at the lower left, for all data points with values less than -0.4 on the X-axis. These data points, if removed from the graph, would appear to allow the trend line to pass very close to or through the points mentioned just above, (0,0), etc. It appears that the problem exists with the twin peaks around 1895 – 1900 in your Figure 3.

Charles Higley
April 25, 2010 6:37 pm

There was a comment somewhere near the bottom here which mentioned that the longer term temperature changes should also be included. One of them mentions that the sinusoidal pattern may trend downward, a la the Little Ice Age.
Good point. Does it not bother anybody that the Holocene Optimum, Minoan, Roman, Medieval, and Modern Warm Periods are showing a clear downward trend?
Our next high will most likely be cooler than our Modern high.

Dave McK
April 25, 2010 6:49 pm

I believe the author is doing what a person interested in extracting meaning from time series data would have done first thing.
Fourier transform.
Do NOT average.
Of course CO2 correlates, Telford. Effects always follow their causes – they never precede them.
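[Editorial illustration: a Fourier transform applied to the raw (unaveraged) series does pick out a multidecadal periodicity directly, as Dave McK suggests. Demonstrated here on a synthetic annual series with a known 60-year cycle plus noise; with only ~130 samples, the frequency resolution is coarse, so the estimate lands near, not exactly on, 60 years.]

```python
# Find the dominant period of a series with an FFT.
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1880, 2010)                    # 130 annual samples
series = (0.2 * np.sin(2 * np.pi * years / 60.0)
          + rng.normal(0, 0.05, years.size))

# Remove the mean, then look at the magnitude spectrum.
detrended = series - series.mean()
spectrum = np.abs(np.fft.rfft(detrended))
freqs = np.fft.rfftfreq(years.size, d=1.0)       # cycles per year

k = np.argmax(spectrum[1:]) + 1                  # skip the zero frequency
dominant_period = 1.0 / freqs[k]
print(f"dominant period ~ {dominant_period:.1f} years")
```

For a real anomaly series, the linear trend should be removed first (not just the mean), or its leakage swamps the low-frequency bins.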

April 25, 2010 7:02 pm

I think this analysis is useful in the near term to rule out any role of CO2 in climate ‘forcing’.
However, I have to agree with others who have posted stating that this model should not be extrapolated forward or backward in time beyond the next one or two cycles, and should not be relied on historically beyond one or two cycles from the start of this analysis in 1880.
I think this model is analogous to a 2D formula correctly describing a brief projection of a 3D trend. Because this model makes no attempt to define the rising GMTA, but only describes its attributes as projected into the short term, there is no long term applicability here.
Still, the mathematical analysis is commendable, and the definition of 20th century temperatures can be derived from the formulas given.
As others have stated, though, this can’t be the true picture: extrapolated not too far into the past, absolute zero is reached, and not too far into the future, everything would be plasma.

WestHoustonGeo
April 25, 2010 7:10 pm

Quoting:
“Sorry, but we English get a little fed up about the corruption of the language.”
Commenting:
We know. That’s why we do it. 😉

Dave N
April 25, 2010 7:11 pm

Those blasted observations are continuing to screw up the models! Someone do something about them already.. before it’s too late!!!

April 25, 2010 7:15 pm

MartinGAtkins (17:52:39) :
LOL

Steven mosher
April 25, 2010 7:23 pm

Roger Sowell (18:32:26) :
“First principles models begin with known physics, and when performed properly, can be used to extrapolate into the future. These must include all variables, and correct equations for their behavior, which the GCMs clearly do not. Hence the GCMs are wrong at this time, and will remain wrong until they comply with the fundamentals.”
Physics models do not have to include all variables; they have to include the relevant variables for the purpose at hand. For example, I may have a very high-fidelity physics simulation (model) of an aircraft in flight, or a bomb as it drops. In both cases, I may opt to use a model with less fidelity, with variables missing, provided my solution is within the tolerances that my task requires. For example, if I want to calculate the range of the aircraft, I can probably do without a wide variety of system parameters. I can even “estimate” or “project” what the atmosphere will be like (a standard day, for example) without actually modelling the atmosphere. I can “forecast” the winds I expect to encounter (a head wind of 20 knots) without modelling their exact spatio-temporal aspects. For other tasks this might not be good enough; for example, I may need a gust model. The same goes for the physical process of dropping bombs and predicting where they will land. Close enough is sometimes good enough for the task at hand. Like many generalizations about models, the idea that all variables must be present is not factually correct. The same goes for the equations used to define the physics: again, it depends on the task. Finally, a GCM, just like any law of nature, will always be “wrong” in the trivial sense that it doesn't predict EXACTLY. The question is: is the model skillful for its purpose? Does it “work”?
Does it work better than a naive model?
In 1850 the global temperature was X. A naive forecast (Willis' null hypothesis) would predict that the temp would be X today.
CO2 content is not information in his model. In fact, there is no information in the naive model. In a naive model (there is just natural variation), if it's X in 1850, then your prediction for 2010 will just be X plus or minus some large number. Hard to go wrong with that null. But even the simplest physical model that attributed some warming to CO2 would beat that model for skill. Skill is about power in a test, but I'm off topic now.
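Mosher's notion of skill against a naive baseline can be illustrated with a toy comparison on synthetic data. This is my sketch with invented numbers; the point is only that any model worth keeping should at least beat the "temperature stays at X" forecast:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1850, 2011)
# Synthetic anomaly: a weak linear drift plus noise (illustrative only)
anomaly = 0.004 * (years - 1850) + rng.normal(0, 0.1, years.size)

train, holdout = anomaly[:120], anomaly[120:]

# Naive model: "it was X in 1850, so it is X today"
naive_forecast = np.full(holdout.size, train[0])

# Simplest alternative: a straight-line trend fitted to the training years
coef = np.polyfit(years[:120], train, 1)
trend_forecast = np.polyval(coef, years[120:])

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

naive_rmse = rmse(naive_forecast, holdout)
trend_rmse = rmse(trend_forecast, holdout)
print(naive_rmse > trend_rmse)   # the trend model shows skill over the naive one
```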

Al Gore's Holy Hologram
April 25, 2010 7:30 pm

To be frank, this article could have been cut short simply by mentioning that nobody REALLY knows what the world’s median temperature really is or has been since temperature keeping began. All records have biases, are local, and then processed through befuddled computer algorithms to come up with a seemingly sensible average, which it really isn’t. It’s an average of the RECORDS, not the world. When we’re talking about tenths or hundredths of a degree, that matters a lot.
And to make it all the more laughable, nobody really knows how much CO2 is in the atmosphere, where it is all being produced, how much is produced, or has accurate knowledge of the CO2 cycle. All we have is some localised readings which again are parsed to come up with an average without knowing if it is accurate or what the carbon cycle has in store for us in the future.

Anticlimactic
April 25, 2010 7:43 pm

The skeptics viewpoint is mostly that climate is driven by the sun, leading to roughly 30 year periods of warming and cooling, moderated by a slight overall increase as we are still emerging from the ‘Little Ice Age’. The current view is that 1998 was the warmest year, and global cooling started in 2005. As the sun was unusually quiet for the past 3 years it is expected that the next 20 years of global cooling may be severe.
From this viewpoint the above is par for the course. The question is whether the sun’s behaviour means we are heading for a Dalton minimum, in which case the gradual upward trend will come to a dramatic end!
I was hoping that we would now be in a position to give a comprehensive description of the [possible] mechanism governing these 60 year cycles of cooling and warming. The effects of cosmic rays on climate seems to be gaining credence, more is known about the oceanic oscillations, and our knowledge of the sun is increasing rapidly. Is anyone in a position to put it all together to provide an approximate model of climate change?

dr.bill
April 25, 2010 7:44 pm

RACookPE1978 (16:36:30) :
Politely, as a long time proponent of cyclical behavior in climate outcomes, I must mildly disagree with a few points and assumptions made by the authors, and very strongly disagree with a few of the criticisms……

Good comments.
Another example is Max Planck and the blackbody radiation curve. His first step was curve-fitting. His second was to see what needed to be changed in the traditional analysis in order to produce the function that he had found. He never really believed his own results, but he was right anyway, and it led to the development of Quantum Theory. 🙂
The first attempts are seldom perfect, but this series of steps is perhaps the most common paradigm in Science. You do, of course, eventually need to back it up with something fundamental.
Some reading here: http://www.daviddarling.info/encyclopedia/Q/quantum_theory_origins.html
/dr.bill

Richard Sharpe
April 25, 2010 7:58 pm

What a wonderful streak of rule breaking by the IPCC:
http://bishophill.squarespace.com/blog/2010/4/25/ipcc-in-trouble-again.html

davidmhoffer
April 25, 2010 8:04 pm

Roger Sowell (18:32:26) :
“First principles models begin with known physics, and when performed properly, can be used to extrapolate into the future. These must include all variables
Steven mosher (19:23:09) :
physics models do not have to include all variables. They have to include the relevant variables for the purpose at hand>>
Itz a chaotic system; the possibility of including and correctly modeling ALL variables seems slight. What we need is to understand what the DOMINANT variables are. Since it is a chaotic system, trying to start by making a list of all the variables we can think of, hoping we didn’t miss any, and building a model from there seems futile. Does it not make sense to evaluate the data for repetitive patterns and then determine if they provide clues as to which processes are dominant?
The moon has an orbital variation of 18.6 years, sun spots 11 and 22, AMO is about 60. Does it not make sense, given the clear 60 year cycle presented above, that we investigate elements such as these which divide nicely into 60? Do they roughly coincide in some way with the temperature cycle? When we go backward or forward in time do they diverge and is there evidence that when they diverge the short term (60 year) cycle also disappears from the record?
If one is trying to prove a specific principle of physics, then start with a model of that element. But a chaotic system with thousands of elements requires that we narrow the number to the most significant elements and then refine from there. Seems to me this study pretty much shows that CO2 is not one of the elements to consider. But it provides important clues as to what the dominant elements are, and from that perspective, it has value.
Darwin came up with evolution by making observations and then searching for a scientific explanation. Newton supposedly got thinking about gravity upon being struck by an apple to the head. Galileo didn’t just decide one day that the earth circled the sun and then build a telescope to prove it. He made careful observations and came up with a theory that fit the data. There is nothing “unscientific” about observations providing clues as to which physics is important and which less so. Itz when you investigate the physics you THINK is important, and it doesn’t match, so you discard the data, that you are no longer in the realm of science.

April 25, 2010 8:09 pm

Steven mosher (19:23:09):

In 1850 the global temperature was X. A naive forecast ( willis’ null hypothesis) would predict that the temp would be X today.

I’m not sure I understand what you’re saying there. The planet is still emerging from the LIA. There is a small natural warming trend, but there is no solid, empirical evidence showing that CO2 is the primary cause.
In fact, if the CO2 entity is kept out of the equation entirely, the result is the same as the null hypothesis, once again supporting Occam’s Razor: Never increase, beyond what is necessary, the number of entities required to explain anything.
Opinions about the effect of CO2 vary widely, from the IPCC’s preposterously high sensitivity numbers based on a CO2 persistence of over a century, to the other extreme, where CO2 is a negative forcing. But there is no empirical test showing the actual number. Nobody, in fact, knows the specific forcing caused by CO2, if there is any [I assume there is some CO2 forcing, but my assumption is based on physics, not on measurable evidence, because there is none. Please correct me if I’m wrong about that].
We do, however, have charts of past temperature trends showing that the steady rise in CO2 [only about 3% of which is emitted by human activities] does not correlate well at all with global temperatures: click

Girma
April 25, 2010 8:13 pm

Richard Telford (13:23:18)
You wrote,
“Your model describes the data passably, but has no predictive power, other than that of predicting your credibility – absolute zero”
The model was valid for 129 years. As a result, it is reasonable to assume that it will be valid at least for the next 20 years.
IPCC is wrong in its projection of 0.2 deg C per decade for the next two decades. But the model I described shows the current plateau followed by cooling until about 2030.
The question is whether we will have a cooling or warming globe in the next couple of decades.
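The kind of model Girma describes — a slow linear warming plus a roughly 60-year oscillation — can be sketched as follows. The parameter values here are my own illustrative choices, tuned only to reproduce the 0.06 deg C/decade slope and the "about 0.42 deg C of cooling by 2030" figures quoted in the post, not the author's fitted coefficients:

```python
import math

def gmta_model(year, slope=0.006, amplitude=0.3, period=60.0,
               peak_year=2000.0, baseline=-0.3, ref_year=1880):
    """Anomaly in deg C: linear warming plus a sinusoidal oscillation."""
    linear = baseline + slope * (year - ref_year)
    cyclic = amplitude * math.cos(2 * math.pi * (year - peak_year) / period)
    return linear + cyclic

# With these assumed parameters the cycle peaks near 2000 and bottoms near
# 2030, so the model "predicts" net cooling over that interval:
delta = gmta_model(2030) - gmta_model(2000)
print(round(delta, 2))   # -0.42
```

The 30 years of assumed linear warming (0.18 deg C) minus a full swing of the assumed cycle (0.6 deg C) gives the net 0.42 deg C of cooling.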

Michael R
April 25, 2010 8:25 pm

In both cases, I may opt to use a model with less fidelity, with variables missing, provided my solution is within the tolerances that my task requires. For example, if I want to calculate the range of the aircraft, I can probably do without a wide variety of system parameters. I can even “estimate” or “project” what the atmosphere will be like (a standard day, for example) without actually modelling the atmosphere.

While this is true, it is also completely irrelevant for the modelling of CO2 and climate. Climate by its nature has a lot of variables that drastically affect the final “result”. Dropping a bomb on a location tends to be affected by only a handful.
For example, you drop a bomb from a low-flying plane at 1000 feet. All you need to do is calculate any shear from wind and the slight movement from the moving plane, and, given the solid science showing how fast an object of a certain weight drops, not only can you predict its location down to a few feet, but exactly when in time it will land.
The analogy is the same for the aircraft example you used. While they sound nice, the original commenter you quoted is correct: in GCMs, not including variables that drastically affect the final result would throw the result beyond an acceptable answer. The problem with climate is that multiple small variables come together in a result that is highly susceptible to changes in those variables. Get one slightly wrong and it is the equivalent of your bomber plane leaving the carrier and then setting a course 5 degrees to the east of its target. After an hour of travel time, using all the best methodology and predictive science, I am sure we could determine the exact location of the bomb once dropped – the only problem being we are several hundred miles in the wrong direction.
You seem to be confusing what is required to get even a close-to-accurate model for climate with the far more simplistic models you gave, leaving the impression that we do a good job of predicting climate – we don’t. We are terrible at predicting climate. We haven’t even been able to predict what happens just a year or two from now. In addition, the inability to predict short term, and the complex nature of the climate, mean 10- and 20-year predictions will be even worse. Yes, we can get a ballpark figure – I can look at the graph, guess a spot, and place bets on where we’ll be – and yet the unfortunate aspect is that, in reality, that is about as useful as most climate models created today are.

AusieDan
April 25, 2010 8:29 pm

I agree with this post.
I did some work on this issue last year, but as an unknown, it is impossible to get heard.
I suggest several minor changes:
Start in 1878 (at least notionally, although the first two years’ data are absent).
Use NCDC annual data, as this gives a much closer fit.
Work with a 65 year cycle (half cycles 1878-1911, 1912-1943, 1944-1975, 1976-2008).
Use zigzags rather than cosine. I produced these from short term linear trends of the above half cycles, then added plus and minus 0.15 degrees to give the upper and lower bounds.
Use degrees C or absolute, not anomalies.
You will find that the annual NCDC data fits into these zigzag boundary lines very well and that even the 1998 El Nino just peeps over the parapet.
My analysis agrees with the forecast for the 21st century.
But remember this is just a projection of the last 130 years history.
Should the sunspot – cosmic ray- cloud theory be correct, the current cycle may be about to end and we could be in for a much colder, more unpleasant time.
We have to wait some years yet before we can decide.
The bottom line is this: if past conditions continue, we will be in for a fairly flat temperature for the remainder of our lifetimes.
If not, it may be too cold for comfort.

John Galt II
April 25, 2010 8:33 pm

Thanks for a great post.
Anthony Watts and others have shown the HadCrut data to be artificially warmed, but you needed to use some data set.
What if the actual (not sure how we will find out) data is cooler by say 0.5deg C?
Thanks again for the great work.

AusieDan
April 25, 2010 8:40 pm

Steve Goddard – you said
“Both HadCrut and GISS show a good correlation between CO2 and temperature”
I suggest you try a correlation with, say, the Central England long-term series, which started in the 1600s, or the even earlier Central Europe series (in 1525, I think).
You will find both of these have small, strictly linear increases, while CO2 flatlined until after 1850.
There is no cause and effect.
You are just looking at a meaningless short term correlation (130 years in this context is short term).
In haste as I have to go out to collect grandchildren, which is more urgent than (not) saving the world.
Regards.

jaypan
April 25, 2010 8:44 pm

Observing a sequence and deriving a mathematical model behind it is much better than
– having a single factor in a chaotic system
– which does not fit reality
– and then considering reality a travesty,
– because it does not follow a wrong theory.
And hey, the others don’t consider all factors either.

Andrew30
April 25, 2010 8:47 pm

Not A Carbon Cow (19:02:59) :
“As others have stated, though, this can’t be the true picture, as not too long ago absolute zero is reached, and not too far from now everything would be a plasma.”
However, if the model was encoded in computer language (which was kept secret); and the baseline data for the model was lost or misplaced; and the model was only run back so far as to be plausible; and the model was only run forward, to just beyond your retirement age, on a multi-million dollar super confuser:
Would it work then?
Could you sell it?
A world turning into plasma is actually a great end state.
Leave the lead time a bit obscure, add a tipping point, find two or three other planets in the universe that are more or less all plasma to use as examples. Hey, you could say it was caused by the sudden re-emergence of the hidden heat that was wrapped up in the dark matter in the seventh dimension, but must have been unleashed when the hidden heat density exceeded the carrying capacity of the dark matter’s inner dimensions.
It might also make a pretty good movie.
PS.
The problem with Girma Orssengo’s explanation is that the author has shown their work.

Gail Combs
April 25, 2010 8:56 pm

astonerii (13:28:08) :
As long as you’re using bogus temperature readings to make your hypothesis, you will never have a valid hypothesis. 1934 was warmer than 1998, thus the whole idea is bunk and garbage. It seems strange that the skeptics would indulge the alarmist warmists by using garbage data.
_________________________________________________________________________________
Lighten up. The person had an idea and has the intestinal fortitude and humility to toss it into the WUWT arena for a critique. So please comment on the idea and do not attack the person. Leave that type of behavior to Mr. Mann.
And yes you are correct that he has used a “Mann-made” data set with added warming. That is the type of information he needs to go forward and modify his ideas.
Mann Made Data set
http://i49.tinypic.com/mk8113.jpg
The difference between raw and “adjusted” data sets USHCN
http://cdiac.ornl.gov/epubs/ndp/ushcn/ts.ushcn_anom25_diffs_urb-raw_pg.gif
Hansen Changing graphs
http://i31.tinypic.com/2149sg0.gif
US temp Raw vs “Adjusted”
http://i31.tinypic.com/5vov3p.jpg
European data:
http://i50.tinypic.com/301j8kh.jpg
Avg of thirty stations – raw data: (All 100 yr stations in Australia)
http://wattsupwiththat.files.wordpress.com/2009/12/darwin_zero3.png
This is the information he actually needs: http://www.john-daly.com/stations/stations.htm

AC Adelaide
April 25, 2010 8:58 pm

Exactly, Girma (20:13:39)
The mathematical model fits for the last 100 years and so can be used to predict the next decade or so. Sure, a meteorite might land on New York, or a volcano may erupt here or there, but even so the long-term climate trends look pretty robust. If you look back over the last 1000 or so years, there are no huge dislocations in the trend, so one might expect the next decade or so to follow the trend too. The point is: has this model got more credibility than the IPCC models in terms of their respective predictive powers, in light of the X-trillion-dollar main game that is being played?

April 25, 2010 9:00 pm

Steven mosher:
You are correct that I should have said all “relevant” variables, or perhaps “important” variables. There are a large number of irrelevant variables to most problems of interest, and it makes no sense to include irrelevant variables.
Still, my point is valid. When one omits variables such as volcanoes and clouds from a climate model, even one based on first principles, the effort is doomed to failure. Another very important variable is wind, including direction, strength, humidity, and duration. It also appears that dust-laden wind blowing into the Atlantic from Africa impedes hurricane formation. To my knowledge, wind is omitted also from GCMs.
A better approach when one is faced with numerous variables is to use neural networks. These can quickly and easily evaluate thousands of variables, and identify those that are of interest because they have a significant impact on the outcome. Modelers in the continuous process industries have done exactly this with good success.
My conclusion remains, though, that correlation models are not useful as predictors when they lack the correct variables.
There is an over-arching field of science that performs and publishes research into these matters, and that is Operations Research, with a website at http://www.informs.org/

Editor
April 25, 2010 9:04 pm

stevengoddard (12:37:38)

Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm

Man, I thought the “But CO2 correlates with temperature” argument had been relegated to the museums. Two points:
1. Correlation is not causation. The canonical example of this is two clocks that strike the hour. One is five minutes fast. They are perfectly correlated … but does that mean that clock A causes clock B to strike five minutes later?
For another example of why correlation is not causation, see point two.
2. What you say is true … but there is also a good correlation between a straight line and temperature. Here’s a comparison.

If the red confidence intervals overlap, then we cannot say that they are different. In other words, there is no statistical difference between the correlation of CO2 with temperature and the correlation of a straight line with temperature.
So perhaps you are impressed with the CO2 correlation. For me, something has to outperform a straight line before I pay any attention to it. Your claim about correlation is true … but it is as meaningless as claiming that temperatures will increase in a straight line because of that correlation.

We might expect to see an acceleration in temperature growth over the next century, coming in at the low end of current IPCC estimates and well below Hansen’s 1988 predictions.

People have been predicting an “acceleration in temperature growth” for a quarter century now. There has been no acceleration since Hansen made his predictions of imminent thermal doom in 1988. None.
Instead, what we’ve gotten is deceleration: there has been no significant temperature change since 1995. At what point are people going to notice that acceleration is not happening?
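Willis's straight-line comparison is easy to reproduce on synthetic data. This is an illustrative sketch with invented numbers, not his actual calculation: a noisy linear trend correlates with elapsed time about as strongly as with a smoothly rising CO2-like series, so the correlation alone picks no winner.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1880, 2010).astype(float)

# Synthetic "temperature": linear trend plus noise
temp = 0.005 * (years - 1880) + rng.normal(0, 0.1, years.size)
# Synthetic "CO2": a smooth, accelerating rise (illustrative numbers)
co2 = 290.0 + 0.0012 * (years - 1880) ** 2

r_line = float(np.corrcoef(temp, years)[0, 1])   # correlation with a straight line
r_co2 = float(np.corrcoef(temp, co2)[0, 1])      # correlation with "CO2"
print(round(r_line, 2), round(r_co2, 2))         # both high, and nearly equal
```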

April 25, 2010 9:05 pm

AusieDan (20:40:09) :
Please feel free to make a plot of CET vs. CO2 to demonstrate your claim.

AC Adelaide
April 25, 2010 9:07 pm

If CO2 doesn’t cause warming, and in the absence of the Trenberth’s “hidden heat” that appears to be the case, then we are back to the status quo – which is what the mathematical model describes here.

April 25, 2010 9:09 pm

Willis Eschenbach (21:04:18) :
In order to have a “straight line” you need to have two axes. What are the two axes in your model?
The Temperature vs. CO2 plot shows very good correlation. You might argue over which one is the independent axis (cause and effect) but the correlation is quite good.

Roger Knights
April 25, 2010 9:15 pm

roger (13:08:57) :
Henry Galt
they are indeed unobtainable. This has occurred before, and usually happens when results fail to match up to their expectations. As it would now take a phenomenal rise in average temps for the rest of the year to offset the cool figures realised thus far, let alone to produce the new record annual temperature foretold in their fairy books, they have lost interest and are emulating the sun by having a grand sulk in the universe (sic) of Exeter.

Unfortunately, based on the GISS values (which are in line with the UAH satellite-based values), temperatures for the first three months of the year have set record highs, and temperatures for the last half-year of 2009 were high as well, due to an El Nino. The GISS records can be viewed here:
http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt
The “anomaly” for calendar-year 2009 was .57, tying it with 2007 as the 2nd hottest year. (2005 was the hottest.) The anomalies for the first three months of 2010 are .70, .72, and .83; these high values need to be offset by a reversal of the Southern Oscillation Index. A reversal seems to be occurring, but whether it will reduce the anomalies of subsequent months sufficiently to lower the average for 2010 is uncertain.
As a result, the betting odds of 2010 being the hottest year on the temperature record are now 75% on https://www.intrade.com. (See under the “Climate and Weather” market. I “sold short” at 80%, so I’m ahead at the moment.)

geoff pohanka (12:42:36) :
Arctic ice concentrations are today the largest in nine years. Arctic ice grew until March 31st, the latest ever recorded. Arctic ice is thicker than in 1980. The northern hemisphere had one of the coldest and snowiest winters on record. Solar activity remains low, we have had nine consecutive days without sunspots. The oceans might soon move towards La Nina. The Katla volcano in Iceland might soon become active.
Yes indeed, I would expect the next ten years will determine if the theory of man made global warming is correct or not. In fact, I would take a wager on this one.

There are bets for that on Intrade (see link above).

Robert Kral
April 25, 2010 9:21 pm

Just skimmed this, but Figure 1 is the money figure. If you support that thoroughly, the rest is just background. In terms of public persuasion, the critical argument is “Look what they predicted compared to what actually happened. Now they want us to reorganize the global economy based on this level of predictive ability.”

David Ball
April 25, 2010 9:29 pm

stevengoddard (21:09:58) :”The Temperature vs. CO2 plot shows very good correlation. You might argue over which one is the independent axis (cause and effect) but the correlation is quite good”. Seems to me that CO2 follows temperature, and that is what we are seeing right now, following the warming to ’98. CO2 has gone up, slightly lagging the temperature. The link to man has yet to be shown, IMHO.

Stirling English
April 25, 2010 9:45 pm

Ummm…is there any more to this than fitting a curve to some data? If so, I have missed it.
I vaguely remember from maths classes that there are a large number of polynomials that can fit any known set of data (maybe an infinite number?). Any reason to believe that this one is better than the others? Or that mere curve-fitting has any form of predictive power?
I’m as sceptical as the next man of the Climos’ claims, but this work does not seem to show much more than mathematical dexterity. It would take the RC guys no more than five minutes to destroy. Sorry.
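Stirling English's worry about bare curve-fitting is easy to demonstrate with a sketch on synthetic data: a polynomial can hug a record closely and still be worthless outside it.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.05, x.size)   # one noisy "cycle"

coeffs = np.polyfit(x, y, 9)     # a degree-9 polynomial fits the record closely...
in_sample_err = float(np.max(np.abs(np.polyval(coeffs, x) - y)))

# ...but extrapolated a couple of record-lengths ahead it runs off to absurdity,
# while the underlying sine stays between -1 and 1
extrapolated = float(abs(np.polyval(coeffs, 3.0)))
print(in_sample_err < 0.25, extrapolated > 10.0)
```

A close in-sample fit says nothing about out-of-sample behaviour, which is exactly the objection to extrapolating any fitted curve decades forward.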

Stirling English
April 25, 2010 9:50 pm

In case my previous remark is taken as completely negative: I agree with Robert Kral above that the Fig 1 graph is great… a good and immediate illustration of the way in which the Climos exaggerate their case and certainty. It’s just the reliance on a purely mathematical method below it that I dislike.
Way back I was a theoretician, until the pesky experiments kept on not doing what I expected. So I learnt not to rely on clever ‘tricks’, but to make the theory fit the observations… not to try to do things the other way round.

FrankK
April 25, 2010 10:01 pm

Interesting. An engineering approach. OK, it’s not going to work for “long-term” (100-year) predictions, but it puts the data in perspective (even if the temp data is not all that correct).
Also, I’d reckon it’s a much better bet for shorter-term predictions (10 to 15, perhaps even 30 years) than anything the billion-dollar models have come up with. So it’s got more credibility than the physically CO2-driven variety, in my opinion.
Let’s see what happens over the next 10 years.

DoctorJJ
April 25, 2010 10:07 pm

stevengoddard,
You said “The Temperature vs. CO2 plot shows very good correlation. You might argue over which one is the independent axis (cause and effect) but the correlation is quite good.”
That, in combination with your plot of CO2 and temp, has to be one of the most unintelligent things I’ve ever seen you post on here. You took two variables which were both known to be increasing, you plotted them together and, most importantly, you set the scale. Of course they correlate!!! I could do the same exact thing with temperature and the number of shoes in my wife’s closet. When you set the scales, the correlation is absolutely meaningless. In fact, it wouldn’t even have to be two quantities that were both increasing. You could reverse one of the scales and still show perfect correlation. I could replot your own graph, with your exact data, yet show CO2 going off the chart and temperature remaining relatively flat. Or vice versa.
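DoctorJJ's point does not even need rescaled axes: any two series that merely both increase will show a strong correlation. A sketch with an obviously non-causal pair — synthetic temperatures against a hypothetical, invented shoe count:

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1880, 2010)

# Synthetic temperature anomaly: rising trend plus noise
temp = 0.005 * (years - 1880) + rng.normal(0, 0.1, years.size)
# Hypothetical shoe count: it only ever goes up
shoes = np.cumsum(rng.integers(0, 3, years.size))

r = float(np.corrcoef(temp, shoes)[0, 1])
print(round(r, 2))   # strongly "correlated", with no causal link whatsoever
```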

Girma
April 25, 2010 10:09 pm

Richard Telford (13:23:18)
You wrote,
“Your model is very useful. Extrapolating back in time to the early part of the last ice age, your model predicts temperatures a below absolute zero. I knew the ice ages were cold, but that is perhaps a tad excessive. Predicting forwards, we can determine how long it will be before the oceans boil”
Nowhere in my article have I claimed that the linear warming of 0.06 deg C per decade is a constant. It is like in calculus, where we approximate a curve by consecutive small straight lines with varying slopes. At the moment, we are on one of these lines, and it has a slope of 0.06 deg C per decade.

Editor
April 25, 2010 10:10 pm

stevengoddard (21:09:58)

Willis Eschenbach (21:04:18) :
In order to have a “straight line” you need to have two axes. What are the two axes in your model?
The Temperature vs. CO2 plot shows very good correlation. You might argue over which one is the independent axis (cause and effect) but the correlation is quite good.

Not sure what you mean, Steven. For simplicity, I took the correlation between the year of observation and temperature, but any straight line which is not horizontal would give the same answer.
Next, you say:

The Temperature vs. CO2 plot shows very good correlation.

So does the temperature vs. a straight line, it’s just as good … so what? If you hold that one is important, you have to hold the other is equally important.

Roger Knights
April 25, 2010 10:10 pm

The Ghost Of Big Jim Cooley (13:20:37) :
… you can’t just change it to the American spelling. It is Hadley CenTRE, not Center. Sorry, but we English get a little fed up about the corruption of the language.

You must read Lawrence Durrell’s amusing short story, “Case history,” in Esprit de Corps. Here is an extract to give you a hint:

“I remember now,” I said, “committing the terrible sin of using the phrase, ‘the present set-up’ in a draft despatch on economics.” (It came back gashed right through with the scarlet pencil which only Governors and Ambassadors are allowed to wield–and with something nasty written in the margin.)
“Ah,” said Antrobus, “so you remember that. What did he write?”
“‘The thought that members of my staff are beginning to introject American forms into the Mother Tongue has given me great pain. I am ordering Head of Chancery to instruct staff that no despatches to the Foreign Secretary should contain phrases of this nature.”
“Phew.”
“As you say–phew.”
“But Nemesis,” said Antrobus, “was lying in wait for him, old chap. … Polk-Mowbray was sent on a brief mission to the States in the middle of the war. He saw her leading a parade and twirling a baton. Her name was Carrie Potts. She is what is known as a majorette. I know. Don’t wince. … From then on the change came about, very gradually, very insidiously. …”
…………………..
[Until:]
“I saw him last week. … He was addressing a plate of spaghetti–and do you know what?
“No. What?”
“There was a Coca Cola before him with a straw in it.”
“Great heavens, Antrobus, you are jesting.”
“My solemn oath, old man.”
“It’s the end.”
“The very end. I tried to cringe my way past him but he saw me and called out.” Here Antrobus shuddered. “He said, quite distinctly, quite unequivocally, without a shadow of a doubt–he said, ‘Hiya!’ and made a sort of gesture in the air as of someone running his hand listlessly over the buttocks of a chorus girl. I won’t imitate it in here, someone might see.”
“I know the gesture you mean.”
“Well,” said Antrobus bitterly, “now you know the worst. I suppose it’s a symptom of the age really.”

Stephan
April 25, 2010 10:13 pm

OT, but from Solar Cycle 24, quote: “Solar Update – The spotless streak continues and now sits at 11 days in a row without a sunspot. Solar activity will continue at very low levels for the next 24 hours.” So it looks like David Archibald’s prediction of an SSN 40 max may turn out to be correct, from current trends anyway.

April 25, 2010 10:25 pm

Quoting from the paper again: “According to the Occam’s Razor principle, given a choice between two explanations, choose the simplest one that requires the fewest assumptions. Instead of applying the Occam’s Razor principle by assuming the cause of GMTA turning points to be natural, the IPCC assumed the cause to be man-made [9]:”
This is not quite what Occam’s Razor says. I submit the following discussion that I prepared for one of my essays.
“Occam (of Occam’s razor) expresses a principle; he is an English logician and Franciscan friar (William of Ockham, 1285-1349 CE). Unfortunately I will not be able to meet with him. The principle states that the explanation of any phenomenon should make as few assumptions as possible, eliminating those that make no difference in the observable predictions of the explanatory hypothesis or theory. The principle is often expressed in Latin as the lex parsimoniae (“law of parsimony” or “law of succinctness”): “entia non sunt multiplicanda praeter necessitatem” (roughly translated from Latin as: “entities must not be multiplied beyond necessity”. This is often paraphrased as “All other things being equal, the simplest solution is the best.”) In other words, when multiple competing theories are equal in other respects, the principle recommends selecting the theory that introduces the fewest assumptions and postulates, the fewest entities.”
That does not mean the statement is incorrect, just not quite right. I suggest it be slightly reworded. Since the IPCC’s projections and the models used are based on a false premise, they are not equal in other respects. Assuming for argument that they are equal in other respects, then Occam when applied ….

April 25, 2010 10:26 pm

Willis Eschenbach (22:10:02) :
You seem to be trying to make an argument that all linear relationships are meaningless, because they are straight. That is kind of weird.
The CO2 vs. time plot is definitely not a straight line.

Girma
April 25, 2010 10:30 pm

David Mayhew (12:44:50)
You wrote:
Climate Research Unit (CRU) of the Hadley Center”
Surely these are separate institutions ? The CRU is part of the University of East Anglia and the Hadley Centre is part of the UK Meteorological Office.
My source was http://www.woodfortrees.org which states:
#Data processed by http://www.woodfortrees.org
#Please check original source for first-hand data and information:
#
#—————————————————-
#Data from Hadley Centre / UEA CRU
#http://www.cru.uea.ac.uk/cru/data/temperature/
#For terms and conditions of use, please see
#http://hadobs.metoffice.com/hadcrut3/terms_and_conditions.html
#—————————————————-
#
#File: hadcrut3vgl.txt
http://www.woodfortrees.org/data/hadcrut3vgl/compress:12

Editor
April 25, 2010 10:30 pm

Roger Knights (21:15:47)


Unfortunately, based on the GISS values (which are in-line with the UAH satellite-based values), temperatures for the first three months of the year have set record highs, and temperatures for the last half-year of 2009 were high as well, due to an El Nino. The GISS records can be viewed here:
http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt
The “anomaly” for calendar-year 2009 was .57, tying it with 2007 as the 2nd hottest year. (2005 was the hottest.)

Ummmm … no. The UAH data is definitely not “in-line with” GISS. Here’s the UAH values …
Year, Temp, Rank
1979, -0.07, 24
1980, 0.09, 14
1981, 0.06, 16
1982, -0.15, 28
1983, 0.04, 20
1984, -0.26, 31
1985, -0.21, 30
1986, -0.15, 26
1987, 0.11, 11
1988, 0.11, 13
1989, -0.11, 25
1990, 0.08, 15
1991, 0.12, 10
1992, -0.19, 29
1993, -0.15, 27
1994, -0.01, 23
1995, 0.11, 11
1996, 0.02, 22
1997, 0.05, 18
1998, 0.52, 1
1999, 0.04, 19
2000, 0.04, 20
2001, 0.20, 8
2002, 0.32, 3
2003, 0.28, 5
2004, 0.20, 9
2005, 0.34, 2
2006, 0.27, 6
2007, 0.29, 4
2008, 0.05, 17
2009, 0.26, 7
As you can see, 2005 is almost two tenths of a degree cooler than 1998, which is the warmest in the UAH data. And 2009, far from being tied for second as GISS says, was seventh in the UAH record … not in-line with GISS in any sense.

April 25, 2010 10:36 pm

Girma,
I did a search and found this: click
The comments following that article are also interesting.
The level of attacks from the alarmist contingent shows that you are on to something very important. The next few years will show us whether these fluctuations are normal, natural and routine, and are well within past natural climate parameters, or if they are the result of the small [≈3% human fraction of the trace gas CO2], in which case they must perforce exceed the previous temperature parameters.
I think that as always in the past, the climate will revert to its long term trend line, thus falsifying the CO2=CAGW hypothesis.

April 25, 2010 10:45 pm

Maybe this chart makes it clearer: (GISS anomaly * 100) + 225 vs. CO2
https://spreadsheets.google.com/oimg?key=0AnKz9p_7fMvBdE9rZ3lzMHRRaGxUb3JHRXZfU0daeWc&oid=2&v=1272260546887

Girma
April 25, 2010 10:56 pm

Leif Svalgaard
“John Cooke (13:13:35) :
As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.
I agree, this is just curve fitting a posteriori. But so is so much that goes for ‘science’ these days.”
Mathematicians seek out patterns. Based on the observed GMTA data, the pattern is a combination of linear and sinusoidal functions. As this pattern was valid for the last 129 years, it is reasonable to assume it will be valid for the next 20 years.
Otherwise, how are you going to tell me whether our globe is going to have further warming or cooling in the coming 20 years?
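The fit Girma describes (a linear trend plus a sinusoid, fit by least squares) can be sketched in a few lines. This is an illustration on synthetic data only, with an assumed 60-year period and made-up coefficients — not the actual HadCRUT series or the post’s fitted values:

```python
import numpy as np

# Synthetic "anomaly" series: linear trend plus a 60-year oscillation plus noise.
# Illustrative values only -- not the observed GMTA data or Orssengo's coefficients.
rng = np.random.default_rng(0)
years = np.arange(1880, 2010)
true = -0.4 + 0.005 * (years - 1880) + 0.2 * np.cos(2 * np.pi * (years - 1940) / 60.0)
anom = true + rng.normal(0.0, 0.05, years.size)

# With the period fixed (here 60 yr), the model is linear in its remaining
# parameters, so an ordinary least-squares solve recovers them:
#   anom ~ a + b*t + c*cos(w t) + d*sin(w t),  w = 2*pi/60
t = years - years[0]
w = 2 * np.pi / 60.0
X = np.column_stack([np.ones_like(t, dtype=float), t, np.cos(w * t), np.sin(w * t)])
coef, *_ = np.linalg.lstsq(X, anom, rcond=None)
fitted = X @ coef

# Goodness of fit (R^2) over the sample period
ss_res = np.sum((anom - fitted) ** 2)
ss_tot = np.sum((anom - anom.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"trend = {coef[1]:.4f} degC/yr, R^2 = {r2:.3f}")

# Extrapolating the same expression beyond the data is exactly the step the
# critics in this thread question: the fit describes the sample, but nothing
# in it guarantees the pattern persists.
t_future = np.arange(2010, 2031) - years[0]
Xf = np.column_stack([np.ones_like(t_future, dtype=float), t_future,
                      np.cos(w * t_future), np.sin(w * t_future)])
projection = Xf @ coef
```

Note that the high R² here only says the chosen basis describes the sample well; it says nothing about whether the extrapolation is trustworthy.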

Mooloo
April 25, 2010 11:00 pm

Ockham’s razor applies – if the shown correlation is as good as the elaborate forcing ones, then it is better, not worse.
Insisting on the need for every possible influence is idiotic in many instances. Tides can be quite easily predicted using past patterns. That we now know how they work isn’t actually that much use (though interesting). There are so many confounding factors that past patterns are actually the best method, adjusted by measurement after the fact.
Trying to work out tides from first principles would be idiotic. Almost as idiotic as trying to work out all the confounding factors in climate.

Editor
April 25, 2010 11:09 pm

stevengoddard (21:05:22)

AusieDan (20:40:09) :
Please feel free to make a plot of CET vs. CO2 to demonstrate your claim.

Same problem as above, but the correlation is much, much worse.

April 25, 2010 11:10 pm

Girma: You have done some first class work here. Lots of good comments from others as well. I strongly suspect the AGW people will attack with all mouths blazing. Since they will have trouble attacking the message, they will surely attack the messenger. That is the reason for my concerns as expressed in the other posts. They may seem picky, and in the blog world probably are. The thing needs to be so clean that few if any science- or logic-based arguments can be made against it. Make sure all logic is deductive. The language and semantic implications must be very tight. The reference list should be as solid and extensive as possible. And remember, models do not produce data, only results. I have written a number of short essays looking at different aspects of the philosophy of science. Some of them may be helpful reminders. They are at: http://retreadresources.com/blog

Editor
April 25, 2010 11:14 pm

stevengoddard (22:26:39) : edit

Willis Eschenbach (22:10:02) :
You seem to be trying to make an argument that all linear relationships are meaningless, because they are straight. That is kind of weird.
The CO2 vs. time plot is definitely not a straight line.

Please quote where I made anything like that argument. My point is that a model of a varying phenomenon like temperature should outperform a straight line.

WAM
April 25, 2010 11:26 pm

@Girma
Sorry to say it again. You have assumed a simple model (linear trend plus harmonic component), based on very weak assumptions, as pointed out by contributors to the discussion (you omitted known long-period components).
Next you did a linear LSQ fit to the available data.
Your model can describe (very roughly) the time period used for the approximation. You cannot claim anything for the future, even for 10 years (e.g. compare the IPCC linear trend predictions based on the years 1980-2000 with today’s plateau). The reason is the unit root and spurious fit (so there is no way to say anything about your coefficients in terms of “goodness of fit”).
Again, time series modelling is much more advanced nowadays; read Dr Stockwell’s blog and VS’s comments (plus the long discussion by VS on this subject on Bart’s blog).
And for how complex the modelling of temperature might be (due to the underlying physical and meteorological processes involved), have a look at the book by Prof. Marcel Leroux.
And your model just tells us:
the system has a constant linear secular term and is an undamped linear oscillator (a “mass-spring” system) – how can you identify and name the processes giving such a model (and behaviour)?
Sorry, but it looks like an Excel exercise I used to give students when teaching them to use Solver (finding the period of the sinusoidal component and the lag in the periodic term).
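The unit-root/spurious-fit objection WAM raises is easy to demonstrate: two independent random walks, which share no causal connection at all, routinely show substantial correlation in their levels. A minimal sketch on synthetic data (the series lengths and trial count are arbitrary choices for illustration):

```python
import numpy as np

# Two independent unit-root (random-walk) series per trial: neither causes the
# other, yet regressing one level on the other routinely looks like a "fit".
rng = np.random.default_rng(1)
n = 130        # roughly the length of the temperature record under discussion
trials = 200

r_levels, r_diffs = [], []
for _ in range(trials):
    x = np.cumsum(rng.normal(size=n))   # independent random walk 1
    y = np.cumsum(rng.normal(size=n))   # independent random walk 2
    r_levels.append(abs(np.corrcoef(x, y)[0, 1]))
    # First-differencing removes the unit root; the differences are iid noise,
    # so their correlation collapses toward zero.
    r_diffs.append(abs(np.corrcoef(np.diff(x), np.diff(y))[0, 1]))

mean_levels = float(np.mean(r_levels))
mean_diffs = float(np.mean(r_diffs))
print(f"mean |r|, levels: {mean_levels:.2f}; first differences: {mean_diffs:.2f}")
```

The gap between the two averages is the spurious-regression problem in miniature: correlation of levels in unit-root series is not evidence of a relationship.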

John A
April 25, 2010 11:30 pm

As far as I can see, this is not really a model, but a mathematical fit to the data. For a physicist, that’s not really a model.

I completely agree. I think the temptation to think that “the future is a continuation of the present trends” is at the root of the global warming scare and pretty much every apocalyptic movement there has ever been.
There have been a lot of posts on WUWT which claim this or that model has predictive skill but none of them have been particularly impressive or compelling because there is no insight into the underlying forcings or inherent variability of the climate.
They are guesses dressed up as models.
In other news, the Sun has gone into hibernation again – 10 days without sunspots and none on the horizon. I wonder if we’re about to find out how much the solar cycle really affects earth’s climate; if so, it won’t be pleasant.

April 26, 2010 12:06 am

The main problem is that the author tries to fit his model to the flawed HadCRUT, whose early part was statistically cooled down, whose 1940 warming blip was partially reduced/removed, and whose modern period is positively affected by UHI and selective station use.
In reality, the present decade is just a bit warmer than the 1940s, and the 1980s were almost as cold as the 1910 minimum. This is visible in all NH long-term records such as the Armagh Observatory record, and even in the UHI-affected US record or CET.

thelastpost
April 26, 2010 12:25 am

Anticlimactic (19:43:19) :
“The skeptics viewpoint is mostly that climate is driven by the sun”
Um .. no. Which skeptic? (Not this one.) Not-CO2 can mean other things than the sun. The sun is one factor among quite a number.

HAS
April 26, 2010 12:42 am

Willis Eschenbach and stevengoddard (not to mention Girma Orssengo)
You should read http://landshape.org/enm/testing-beenstock/ and http://landshape.org/enm/orders-of-integration/ before bothering to take this conversation any further and explain why this doesn’t apply.
If the correlation is spurious why would you bother drawing any inference from it?

Tenuc
April 26, 2010 12:53 am

davidmhoffer (20:04:20) :
“…Itz a chaotic system, the possibility of including and correctly modeling ALL variables seems slight. What we need is to understand what the DOMINANT variables are. Since it is a chaotic system, trying to start by making a list of all the variables that we can think of, hope we didn’t miss any, and build a model from there seems futile.
Does it not make sense to evaluate the data for repetitive patterns and then determine if they provide clues as to which processes are dominant?…”

As our climate is driven by deterministic chaos, I agree that looking for repetitive patterns (quasi-cycles) is a useful approach to finding the strange attractors which dictate itz behaviour.
However, moving from this stage to building a predictive model is still fraught with difficulty. We do not have sufficiently accurate climate data of a suitable spatial and temporal granularity to make long-term models viable.

April 26, 2010 1:56 am

Bob Tisdale (15:55:54) :
Thanks Bob – that is a wonderful resource, isn’t it. When you are protected by dykes it behoves you to be considerate of nature. The Hadley Met couldn’t give a toss, it would seem.
Thanks to all who answered my plea.

Mike Haseler
April 26, 2010 2:01 am

Dear Wam,
your contribution goes along with the IPCC’s “mankind caused it, let’s find out how”, rather than the scientific “does the evidence cause us to reject the null hypothesis of random variation?”
To put it bluntly, the climate “scientists” who work on this stuff are just out of their depth and haven’t a clue how to deal with a real-world complex system like this.
As for the author, unfortunately he has fallen into the same trap as the IPCC. 1/f-type random noise has strong low-frequency components. Because many people are unfamiliar with this kind of noise, they tend to assume they can just transpose interpretation techniques from simple white noise and get a meaningful result. In fact, pure 1/f-type noise has random components that appear to “trend” or “cycle” in a short sample like this. Not only does this make it extremely difficult to distinguish between noise and signal, it also makes it extremely easy to confuse random fluctuation with “something”.
In a century (if the economic collapse at the end of oil still means we have universities – joke – I hope!) academics will look back at this period and wonder why so much spurious writing occurred on a subject that in retrospect will seem so simple: the world climate varies. If anyone is reading this from the future, can I first say “hi”, and second: “look, it may seem obvious to you in the future, but sitting here without the end of the graph it sure isn’t obvious it is noise!”.
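Mike’s point about 1/f noise can be checked numerically: a series with strong low-frequency content yields apparent linear “trends” far larger than white noise of the same variance. A rough sketch — the spectral-shaping construction of pink noise below is a standard approximation, and all the numbers are arbitrary illustrations:

```python
import numpy as np

def pink_noise(n, rng):
    """Crude 1/f ('pink') noise: scale white-noise Fourier amplitudes by 1/sqrt(f),
    so power falls off as 1/f.  Normalized to unit variance."""
    white = rng.normal(size=n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                    # avoid dividing by zero at DC
    x = np.fft.irfft(spec / np.sqrt(f), n)
    return x / x.std()

rng = np.random.default_rng(2)
n = 1000
trials = 300
t = np.arange(n)
slopes_pink, slopes_white = [], []
for _ in range(trials):
    slopes_pink.append(abs(np.polyfit(t, pink_noise(n, rng), 1)[0]))
    slopes_white.append(abs(np.polyfit(t, rng.normal(size=n), 1)[0]))

# The typical fitted "trend" in pink noise is several times larger than in
# white noise of the same variance: apparent trends come free with 1/f noise.
ratio = np.mean(slopes_pink) / np.mean(slopes_white)
print(f"mean |slope| ratio (pink/white): {ratio:.1f}")
```

Nothing in either series is a real trend; the pink series simply concentrates its variance at low frequencies, which a short-sample trend fit cannot distinguish from signal.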

Roger Knights
April 26, 2010 2:50 am

Willis Eschenbach (22:30:48) :

Roger Knights (21:15:47)

Unfortunately, based on the GISS values (which are in-line with the UAH satellite-based values), temperatures for the first three months of the year have set record highs, and temperatures for the last half-year of 2009 were high as well, due to an El Nino. The GISS records can be viewed here:
http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt
The “anomaly” for calendar-year 2009 was .57, tying it with 2007 as the 2nd hottest year. (2005 was the hottest.)

Ummmm … no. The UAH data is definitely not “in-line with” GISS. Here’s the UAH values …
Year, Temp, Rank
1979, -0.07, 24
…..

I was referring (I guess I should have been more explicit) only to “temperatures for the first three months of the year,” during which period the UAH temperatures have also risen sharply, along with GISS.

April 26, 2010 3:06 am

@ Roger Knights (21:15:47) and @ Willis Eschenbach (23:09:06):
I see the bets are at 77% for 2010 to become the hottest year ever. That’s quite possible if the super-dry El Nino goes on and the sun heats up the Pacific (lower cloud cover was seemingly higher in 1998 – has anybody updated the data?). Then the CO2 line will be above a 90% fit and the straight line goes way down, because global warming is indeed accelerating since 1850, albeit at a smaller rate. See
Hadcrut3-global-land-ocean-index.
Now, assuming that correlation is causation of all the warming, this would indeed mean a direct climate sensitivity of 2K for 2xCO2, which according to Arthur Smith is equivalent to a 3K equivalent sensitivity and thus the model average of the IPCC. See monckton-smith-controversy.
May I suggest calling it a voting tie between the discussed simple model and the IPCC. However, a bit of chilly cloud ionization from Svensmark’s stars and/or a nice volcano will do, and we may find ourselves back in 1970s conditions.
Regards, Climatepatrol

Grumbler
April 26, 2010 3:15 am

“The Ghost Of Big Jim Cooley (13:20:37) :
Sorry, I’m not one to usually nitpick, but the Hadley Centre is a noun (proper). So it CANNOT be Hadley Center – you can’t just change it to the American spelling. It is Hadley CenTRE, not Center. Sorry, but we English get a little fed up about the corruption of the language. We accept other countries adapting it, but not when they’re stating something that’s actually here. Hence, the ‘British Tyre Company’ could never be ‘The British Tire Company’.”
Hear, hear. Next thing you know they’ll be calling the Revolutionary War the War of Independence! 😉
cheers David

Chris Wright
April 26, 2010 3:24 am

stevengoddard (12:37:38) :
“Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm
Whoever made these plots was careful not to apply any filtering. The dots have a large random component that will tend to give the impression of a straight line. If you use, say, a twelve month rolling average, then the dots form a more realistic pattern that looks virtually identical to the standard graphs with temp plotted against time.
It then becomes obvious that there is essentially no correlation at all. All you can say is that both temp and CO2 went up in the 20th century – but then, so did a lot of things.
I would class these graphs as a confidence trick. Apart from anything else, if you plot one variable against another, it’s pretty meaningless unless you can keep all other variables constant, as you would try to do in a laboratory experiment.
Chris
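The 12-month smoothing Chris describes is just a moving average; here is a sketch on synthetic monthly data (illustrative numbers only, not the HadCrut or GISS series), showing how the smoothing changes what the scatter against CO2 looks like:

```python
import numpy as np

def rolling_mean(x, window=12):
    """12-month moving average via convolution; 'valid' keeps only the points
    fully covered by the window, so the output is shorter than the input."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Illustrative monthly series: CO2 rising smoothly, temperature rising slowly
# under large month-to-month noise.
rng = np.random.default_rng(3)
months = np.arange(600)                       # 50 years of monthly values
co2 = 315 + 0.11 * months                     # smooth, nearly linear rise (ppm)
temp = 0.0005 * months + rng.normal(0.0, 0.15, months.size)

# Unsmoothed, the month-to-month noise dominates the dots in a temp-vs-CO2
# scatter; after the 12-month rolling average, the underlying shape of the
# temperature record against CO2 becomes visible.
r_raw = np.corrcoef(co2, temp)[0, 1]
r_smooth = np.corrcoef(rolling_mean(co2), rolling_mean(temp))[0, 1]
print(f"r (monthly) = {r_raw:.2f}, r (12-month smoothed) = {r_smooth:.2f}")
```

This is only a mechanical illustration of the filtering step; whether the smoothed scatter then supports or undermines a CO2-temperature link is exactly the question being argued in this thread.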

April 26, 2010 3:26 am

Stephan (22:13:22) :
“OT but from solar 24 quote: “Solar Update – The spotless streak continues and now sits at 11 days in a row without a sunspot. Solar activity will continue at very low levels for the next 24 hours. ” So it looks like David Archibald’s prediction of SSN 40 max may turn out to be correct from current trends anyway”
We had SSN 70+ in February, and there are a few years yet till maximum. Expect some very strong solar activity till 2013, and a corresponding rise in temperatures. The risk is also high for damaging solar storms in C24, as there are up to 10 times as many of these events through the maximums following quieter minimums. Archibald is just wiggle matching, like everyone else who is predicting a decline in world temperatures till at least 2030 – watch them go very quiet over the next few years of warming! Successful forecasting can only be achieved through an understanding of the causation, not assumptions about perceived trends. I would be very happy to put down money now on 2025 to 2038 being very warm.

dr.bill
April 26, 2010 3:50 am

Mike Haseler (02:01:57) :
….. In a century (if the economic collapse at the end of oil still means we have universities – joke – I hope!) academics will look back at this period and wonder why so much spurious writing occurred on a subject that in retrospect will seem so simple: the world climate varies. If anyone is reading this from the future, can I first say “hi”, and second: “look, it may seem obvious to you in the future, but sitting here without the end of the graph it sure isn’t obvious it is noise!”.

As an academic writing from the future, thanks for the “hi”. As for your other comment, it’s only a few hours into the future, and we don’t know the answer yet. ☺
/dr.bill

Alan Millar
April 26, 2010 4:03 am

Girma (22:56:56) :
“Mathematicians seek out patterns. Based on the observed GMTA data, the pattern is a combination of linear and sinusoidal functions. As this pattern was valid for the last 129 years, it is reasonable to assume it will be valid for the next 20 years.
Otherwise, how are you going to tell me whether our globe is going to have further warming or cooling in the coming 20 years?”
That’s the trouble with assuming that this curve fitting has any predictive value and for how long if it has.
What do we absolutely know to be true about this ‘model’?
It didn’t correlate (work) looking backwards for any significant time. We haven’t seen sub-zero average temperatures on the Earth very much!
We know it will stop correlating (working) some time in the future. The Earth is not going to melt any time soon!
However, for its predictive value, when is it going to fail? How do we know it is not tomorrow? How do we know it wasn’t yesterday, for that matter?
That’s the trouble with curve fitting without a clear understanding of all the significant processes involved: the fits don’t have much predictive value, and the longer you run them the less value they have.
Just like the current GCMs actually!
Alan

April 26, 2010 4:52 am

I did a very similar analysis in 2008 by using sine waves and solved for best RMS error, and reached very similar conclusions, except I used PDO, AMO and CO2. My R^2 was 0.89, and like yours, I projected max cooling at year 2028.
The influence of CO2 was in agreement with studies using negative feedback (almost zero influence). I don’t remember R^2 but I think it was around 0.2 once other factors are eliminated.
I’ll try to dig out that analysis and post the results.
I understand the few comments regarding physics, but honestly, if $70 billion isn’t enough to even remotely understand the physics, isn’t showing that natural factors dominate the correct first step? These cycles may have chaotic components, but since they do seem repeatable over the short term, I think they are highly useful and provide a much more plausible predictive ability than anything I’ve seen come out of the IPCC.

davidmhoffer
April 26, 2010 4:57 am

Tenuc
However, moving from this stage to building a predictive model is still fraught with difficulty. We do not have sufficiently accurate climate data of a suitable spacial and temporal granularity to make long-term models viable.>>
No. But it seems we have enough for the short term (10 to 20 years). As the article author points out, his model is a fit for the last 129 years. I only started looking into climate in detail a few months ago, but the 60-year (give or take) cycle jumped out at me a long time ago, suggesting a peak about now followed by a cooling trend for a decade or two. With ice extent increasing, ocean heat content dropping, land temperatures flat, solar activity down, etc., it seems like that is exactly what is happening. Does this mean the author’s model is valid for the next century or two? No. Does it mean that climate is far less connected to CO2 than to other factors? Looks that way. Does it mean that the decisions aren’t urgent and the stakes not high after all? Looks to me like the trend of the last 129 years continues, and the only urgency is that we stop panicking about what “we” are doing to the climate, calm down, and start studying what the climate is actually doing and why. There may in fact be reason to panic for all I know, but the dominant factors, if there are any, will be something other than CO2. It would be nice to know what they are, and which direction they are going to go a couple of decades from now.

FrankK
April 26, 2010 5:07 am

Grumbler says:
“Sorry, but we English get a little fed up about the corruption of the language.”
Logically, Grumbler, Centre should of course be pronounced ‘Centree’,
and what about your lieutenant pronounced ‘leftenant’??? (from Middle English levetenant) Yet you say “in lieu” of something (LOL). Sorry buddy, not logical, and a corruption (get up to date) – and I am not English or American.
Back to the subject:
Some have said this is not a model. I disagree; for me it’s an analytical model as opposed to a numerical model, and probably quite useful. It just needs a few years of validation.

Editor
April 26, 2010 5:12 am

Quite a bit of work, but Akasofu’s work isn’t in the references. Hasn’t that reached India yet? His model is a continuing recovery from the Little Ice Age, linear over the time range considered, plus a multi-decadal oscillation, like the PDO, that is not a pure sinusoid.
More references to the mix:
http://wattsupwiththat.com/2009/03/20/dr-syun-akasofu-on-ipccs-forecast-accuracy/
http://people.iarc.uaf.edu/~sakasofu/pdf/two_natural_components_recent_climate_change.pdf
http://people.iarc.uaf.edu/~sakasofu/
It would be nice if Orssengo wrote an addendum comparing his model with Akasofu’s. One thing I really like about Akasofu’s model is that it has only two components. As you include more and more components, you can adjust the weightings of each and describe almost anything. However, the result often has little predictive power, as many Wall Street investors have discovered.
From http://www.sciencebits.com/FittingElephants:

A quote attributed to John von Neumann (at least so it was told by Enrico Fermi to Freeman Dyson) sums up the first part of the story: “Give me four parameters, and I can fit an elephant. Give me five, and I can wiggle its trunk.”

One of William Gray’s changes to his hurricane predictions has been to reduce the number of parameters. I think he noted that as they found dependencies between them, they could replace them with predictors closer to the important parameters.

Editor
April 26, 2010 5:26 am

stevengoddard (12:37:38) :

Both HadCrut and GISS show a good correlation between CO2 and temperature.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm
We might expect to see an acceleration in temperature growth over the next century, coming in at the low end of current IPCC estimates and well below Hansen’s 1988 predictions.

Joe D’Aleo found a better correlation with “PDO And Solar Correlate Better Than CO2.”
http://wattsupwiththat.com/2008/01/25/warming-trend-pdo-and-solar-correlate-better-than-co2/
So we might see a deceleration in temperature growth over the next 30 years, and the matching of that to the acceleration during the last warm PDO raised more handwringing than was warranted.
If CO2 does have an effect based on the logarithm of its concentration (per the approximation of its absorption curve), and if CO2 is increasing exponentially, then we can’t expect much more than a linear increase due to CO2.
CO2’s increase is more like an exponential added to a constant baseline, which would translate to an acceleration; but given CO2’s relatively poor performance of late, and the uncertainties (curse you Leif!) of solar effects, my expectation is with Akasofu.
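The log-of-an-exponential argument above is easy to verify numerically. The sketch below uses the common simplified forcing approximation F = 5.35 ln(C/C0) W/m² and made-up growth numbers — assumptions for illustration, not figures from the comment:

```python
import numpy as np

# Hypothetical numbers for illustration.  The simplified forcing expression
# F = k * ln(C / C0) is logarithmic in concentration.
k = 5.35       # W/m^2 scale factor in the common log approximation
C0 = 280.0     # preindustrial CO2, ppm

years = np.arange(0, 101)
g = 0.005      # assumed exponential growth rate per year

# Case 1: pure exponential growth, C(t) = C0 * exp(g t).
# The log cancels the exponential, so F(t) = k * g * t is exactly linear.
F_exp = k * np.log(C0 * np.exp(g * years) / C0)

# Case 2: exponential growth on top of a constant baseline,
# C(t) = B + A * exp(g t), as the Editor notes.  The log no longer cancels
# the exponential, so the forcing accelerates.
B, A = 260.0, 20.0
F_base = k * np.log((B + A * np.exp(g * years)) / C0)

# Linearity check: second differences vanish for case 1 but not for case 2.
d2_exp = np.diff(F_exp, 2)
d2_base = np.diff(F_base, 2)
print(np.max(np.abs(d2_exp)), np.max(d2_base))
```

The vanishing second difference in case 1 is the comment’s point: a logarithmic response to an exponentially growing input is, at most, a linear rise; case 2 shows why the constant-baseline caveat changes that.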

Pascvaks
April 26, 2010 5:35 am

A thought came to me after scanning through all this. It doesn’t relate to anything anyone said, as far as I recall.
‘Scientific and Technical papers should be published using pen names, with no reference to the author’s identity or education background.’
No need to applaud.

John G
April 26, 2010 5:49 am

It is a model of sorts, not just a curve fit. It assumes that some of the force driving climate is cyclical in nature and some is a secular upward force. It seems to me that since it fits the observations better than the models based on the idea that CO2 is causing the warming, the CO2 model is discredited.

April 26, 2010 5:49 am

The whole Y axis equals 1°C. HadCRUT3 would disappear if the Y axis were plotted in 1°C increments (less than 1°C cannot be perceived by any of us).

len
April 26, 2010 6:05 am

Forward this to Tom Friedman at the New York Times. He’s still pushing the carbon tax hustle.

VictorianModesty
April 26, 2010 6:23 am

An interesting article, but rather than rehash some of the main points brought up by other posters, I am going to dwell on three.
1) The IPCC is not just concerned about global warming. It’s climate change; they are concerned about erratic and severe changes in climates around the world, including but not limited to cooling in tropical areas, loss or change in dominant vegetation stands, and changes in natural cycling. The media usually likes to portray only ‘GLOBAL WARMING’ because it has been a catch phrase since the mid 1990s and it is a lot easier for a lay person to grasp the basics and remember, versus ‘OSCILLATING OCEAN CURRENTS’ or ‘DENDROCLIMATOLOGY’ or ‘DESERTIFICATION’.
2) The results and discussion are interesting, but one of the cons of shorter articles is a lack of methods. How many times were the simulations run for the above results? How many data sets were used, i.e. was it a data set from 2001, or were the updated sets from 2002 to the present included as well? Did the analysis incorporate results for the standard deviations, modes, medians? Was any data removed as an outlier, and why?
3) I read through this a couple of times – was the model set up in Microsoft Excel? “In the above equation, the factor 2*3.1416 is used to convert the argument of the cosine function to radians, which is required for computation in Microsoft Excel. If the angle required is in degrees, replace 2*3.1416 with 360”. I hope the model was created in software OTHER than Excel. Excel is a great preliminary tool and keen for simple stats, but the sheer number of variables involved in this topic cannot feasibly be incorporated and simulated in Excel without error. I am willing to bet the scientists under the IPCC and other statistical institutions are not publishing their results in peer-edited journals using Excel. Any thoughts on this?
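On the radians point quoted here: the 2*3.1416 factor is only a unit conversion, as a two-line check shows. The period and time values below are arbitrary illustrations:

```python
import math

# Spreadsheet trig functions, like Python's math.cos here, expect radians.
# A cycle of period T years contributes cos(2*pi*t/T); with degrees it would
# be cos(360*t/T).  Both forms give the same number.
period = 60.0   # years -- the multidecadal cycle discussed in the post
t = 15.0        # years past the reference crossing (illustrative)

via_radians = math.cos(2 * math.pi * t / period)
via_degrees = math.cos(math.radians(360.0 * t / period))
print(via_radians, via_degrees)
```

So the 2*3.1416 vs 360 choice in the article is cosmetic, not a source of numerical error.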

tim
April 26, 2010 6:27 am

The only suggestion I could make to this incredible site is: please recognise readers are not all academics or scientists. A “conclusion for general information”, or something similar, would help us to get a precis…and maybe distribute the good work to the media in a popular and easily digestible format.

Gail Combs
April 26, 2010 6:29 am

stevengoddard (21:09:58) :
Willis Eschenbach (21:04:18) :
In order to have a “straight line” you need to have two axes. What are the two axes in your model?
The Temperature vs. CO2 plot shows very good correlation. You might argue over which one is the independent axis (cause and effect) but the correlation is quite good.
________________________________________________________________________________
Of course the correlation is good. Approximately 70 percent of the planet is ocean, with an average depth of roughly 3,700 meters. CO2 is more soluble in cold water and less soluble in warm water: the more the oceans warm, the more CO2 outgases. Also, as it warms and CO2 increases, the biosphere revs up. Plants respire CO2 at night, as do animals, and then there are all those microbes decaying biomatter and emitting CO2 (think beer).

kwik
April 26, 2010 6:36 am

kwik (15:06:14) :
“Isnt this post missing a plot on how the formula is showing the temperature to unfold, say, the next 10 years?”
Okay, found it in Figure 3. Sorry.
Well, after reading it once more, slowly, my comment is:
This is the way a guy dealing with electronics, process control, spectral analysis etc. would do it.
It makes common sense. I cannot see any problem with it.
Of course no one knows when that sloooow increase will flatten out, and why. In any case we cannot do anything about coming out of the last ice age. Just be happy for every day passing by where that slow slope doesn’t start turning downhill again.
The only problem for the politicians is that the slow increase out of the last ice age cannot be taxed. Who to tax? The galaxy?
A great post, and many interesting comments!

davidmhoffer
April 26, 2010 6:43 am

VictorianModesty (06:23:17) :
1) IPCC is not ‘just concerned about global warming’. It’s climate change; they are concerned about erratic and severe changes in climates around the world, including but not limited to cooling in tropical areas, loss or change in dominant vegetation stands and changes in natural cycling.>>
BULL. They have been hyping global warming, screaming about runaway temperature increases, hollering about tipping points and demanding extreme taxation systems to limit the root cause which they say is CO2. What part of the last 20 years of their tripe sounds like “climate change”? Only in the VERY recent few months, faced with how ridiculous their claims have been, have they started to reposition as “climate change”.
VictorianModesty (06:23:17) :
The media usually likes to portray only ‘GLOBAL WARMING’ because it has been a catch phrase since the mid 1990s and it is a lot easier for a lay person to grasp the basics and remember, versus ‘OSCILLATING OCEAN CURRENTS’ or ‘DENDROCLIMATOLOGY’ or ‘DESERTIFICATION’.>>
BULL. It is a scare tactic to justify taxation and an excuse to NOT discuss the science, in fact to IGNORE the science.
VictorianModesty (06:23:17) :
The results and discussion are interesting, but one of the cons of shorter articles is a lack of methods. How many times were the simulations run for the above results?>>
LOL. It is a MATH model. 2+2=4 and it doesn’t matter HOW many times you run it, you get the same answer.
VictorianModesty (06:23:17) :
How many data sets were used, i.e. was it a data set from 2001, or were the updated sets from 2002 to present included as well? Did the analysis incorporate results for the standard deviations, modes, medians? Was any data removed as an outlier, and why? 3) I read through this a couple of times; was the model set up in Microsoft Excel? “In the above equation, the factor 2*3.1416 is used to convert the argument of the cosine function to radians, which is required for computation in Microsoft Excel. If the angle required is in degrees, replace 2*3.1416 with 360″. I hope the model was created on software OTHER than Excel. Excel is a great preliminary tool and keen for simple stats. But the sheer number of variables involved in this topic cannot feasibly be incorporated and simulated in Excel without error.>>
The answers to your questions are pretty much in the article but that last piece is interesting. What mathematical error is it that you think Excel introduces? Can you demonstrate this? Have you told Microsoft?
VictorianModesty (06:23:17) :
I am willing to bet scientists under IBCC and other statistical institutions are not publishing their results in peer edited journals using Excel. Any thoughts on this?>>
How much are you willing to bet? Not that this would invalidate the use of Excel even if it were true. You may as well complain that slide rules aren't accurate enough for science and that an abacus can't be used for math. Either the numbers are right or they are wrong, and it makes zero difference what they were calculated with. But put some money where your mouth is; I'm a bit short this month.

dr.bill
April 26, 2010 6:45 am

VictorianModesty (06:23:17) :
….. Any thoughts on this?

Yes, I have a thought: Get over yourself!
Any tool adequate for the purpose is adequate for the purpose. There’s nothing wrong with Excel (at least nothing that Open Office can’t fix), and most of the world’s climate data has been processed with Fortran, which has been around in one form or another since the 1950’s.
/dr.bill

April 26, 2010 7:00 am

Chris Wright (03:24:11) :
There are no tricks in the plots. They are straight up Temperature vs. CO2 and show good correlation. The fact that you don’t like the results doesn’t invalidate them.

After 7 iterations the fit converged.
final sum of squares of residuals : 12070.5
rel. change during last iteration : -4.17129e-06
degrees of freedom (FIT_NDF) : 127
rms of residuals (FIT_STDFIT) = sqrt(WSSR/ndf) : 9.74901
variance of residuals (reduced chisquare) = WSSR/ndf : 95.0431
Final set of parameters Asymptotic Standard Error
======================= ==========================
a = 0.641343 +/- 0.02305 (3.594%)
b = -927.309 +/- 44.84 (4.836%)
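For anyone who wants to reproduce this kind of straight-line fit outside gnuplot, here is a minimal sketch in Python. The data below are synthetic stand-ins generated from a line with roughly the slope and intercept reported above, not the actual temperature or CO2 series:

```python
import numpy as np

# Synthetic stand-in data: a known line plus noise, with roughly the
# slope/intercept reported by the gnuplot fit above (NOT the real series).
rng = np.random.default_rng(0)
x = np.linspace(300, 390, 129)            # CO2-like values, ppm
a_true, b_true = 0.64, -927.0
y = a_true * x + b_true + rng.normal(0.0, 5.0, x.size)

# Least-squares straight-line fit, as gnuplot's `fit` does for f(x) = a*x + b
a_est, b_est = np.polyfit(x, y, 1)

# Goodness of fit: rms of residuals = sqrt(WSSR / ndf), ndf = points - params
resid = y - (a_est * x + b_est)
ndf = x.size - 2
rms = float(np.sqrt(np.sum(resid**2) / ndf))
print(a_est, b_est, rms)
```

The recovered slope and intercept come back close to the generating values, which is all a least-squares fit of this form can promise.
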

April 26, 2010 7:07 am

Ric Werme (05:26:49) :
Joe’s analysis plotted USHCN vs. CO2, not global temps. Global temps correlate much better.
There are four possibilities concerning the GISS/CRU vs. CO2 graphs
1. Coincidence. Not very likely.
2. CO2 drives temperature
3. Temperature drives CO2
4. Temperature data has been adjusted to fit the CO2 data
One or more of the last three are true, and can’t be ignored.

Girma
April 26, 2010 7:32 am

Is it possible to insert a graph in a post? How?

MikeP
April 26, 2010 7:34 am

Steven,
You’re forgetting
5. Both CO2 and Temperature have a common driving factor which causes them both to vary somewhat together.
and
6. Some combination of the above.

April 26, 2010 7:44 am

tim (06:27:46) : I will digest it for you. For example, in the IPCC graph above where you, at first sight, see a supposed temperature increase: it is a graph intended to cheat. Why? Because nobody feels differences of a tenth of a degree Celsius, and if differences of a tenth of a degree are considered, chances are they are due to reading errors. If you adjust that graph for real human perception, say a degree-per-degree Celsius scale, there won't be any difference: neither a decrease in temperature nor an increase. What does it mean, then? It means this is not about science of any kind whatsoever; it is about politics and politics' eternal companion: money making, money taken from daily working people by cheaters. That graph is plain demagoguery, intended to fool common hard-working people like the majority of us, who do not know any other way of getting money but by working hard, and who, even if we did know how, could not do it because of our moral principles (they don't have any). You now know the real truth about "climate change", "green policies", "recycling", etc., etc.

MartinGAtkins
April 26, 2010 8:57 am

stevengoddard (19:15:10)
LOL
Here’s my new improved version of your graph. :-} I did a normal chart to get the HadCRUT trend. The dates are chosen because my CO2 data only goes back that far.
HadCRUT 1958/3 to 2009/11 (621 months): trend = 0.65 deg C.
CO2: 1958/3 = 316ppm, 2009/11 = 386ppm. Gain over 621 months: 386 - 316 = 70ppm.
0.65/70 = 0.00928571 deg C per 1ppm CO2.
http://i599.photobucket.com/albums/tt74/MartinGAtkins/CO2-Temp3.png
1958 was a cool period so 0.00928571 C per 1ppm CO2 is probably a little high.
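The back-of-envelope sensitivity above is easy to check; a one-line sketch using the figures as quoted:

```python
# Figures as quoted: 0.65 deg C HadCRUT trend over 1958/3-2009/11,
# against a CO2 rise of 386 - 316 = 70 ppm over the same 621 months.
trend_c = 0.65
co2_rise_ppm = 386 - 316

sensitivity = trend_c / co2_rise_ppm      # deg C per ppm of CO2
print(round(sensitivity, 8))
```
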

MartinGAtkins
April 26, 2010 9:22 am

VictorianModesty (06:23:17) :
An interesting article, but rather than rehash some of the main points brought up by other posters, I am going to dwell on three. 1) IPCC is not ‘just concerned about global warming’.
Could you spare us all the BS and dwell on the topic.

April 26, 2010 9:31 am

MartinGAtkins (08:57:05)
You came up with the same numbers I did. At 550 PPM, the temperature increase is 1.5C above the present.

Steven mosher
April 26, 2010 9:53 am

stevengoddard (07:00:14) :
Thanks for your contribution:
I think the question can be put simply:
It's 1850. If you are given one variable to predict the temperature in 2010, what variable would that be, and what would your model look like?
1. If you said natural variability rules (Willis' null), you'd just throw up your hands and say "can't know, can't predict".
2. If you said "follow the sun" and picked TSI as your variable, you'd do horribly.
3. If you picked CO2 concentration, you'd do nicely. Not perfect, of course.
And what would the second variable be?
A version of this was done a while back: Lucia's lumped parameter model. It works quite well as a simple model.

richcar 1225
April 26, 2010 10:11 am

This article clearly challenges IPCC predictions for surface temps, but I really think the debate needs to be moved to the oceans. Surface temps are influenced not only by UHI and improper adjustments but by weather and of course by release of heat from the oceans, which is influenced by ENSO as well as by ‘hiding heat’. The surface data is so sparsely located that, after a hundred years, what can it really represent? I also think that Mg/Ca dating will eventually allow us to tie the modern ocean temperature and OHC record to the past much more accurately than atmospheric proxies like tree rings. Ninety per cent of the joules are in the ocean. I see near panic in the warmists’ response when asked where the missing heat is.
I predict ARGO will bring the entire movement down.

Editor
April 26, 2010 10:38 am

VictorianModesty (06:23:17) :
I hope the model was created on software OTHER than Excel. Excel is a great preliminary tool and keen for simple stats. But the sheer number of variables involved in this topic cannot feasibly be incorporated and simulated in Excel without error. I am willing to bet scientists under IBCC and other statistical institutions are not publishing their results in peer edited journals using Excel. Any thoughts on this?
—…—…—
A recent audit of the IPCC’s latest report shows that 40+ percent of their OWN IPCC document is NOT peer-reviewed in ANY journal, and as much as half of the articles cited in certain sections are merely press releases from the WWF, Greenpeace, and similar international propaganda outfits (both tax-exempt charities and tax-funded NGOs) not accountable to ANYONE for any accuracy in their documents.
As stated above, simple programs are good enough to show 2+2*(cos(60x)) = 4. There are only two variables used: Time and Global Temperature.
I would prefer the author use the slightly more complex form “2 + 2*(cos(60x)) + 2*(cos(900x)) = 4″ …. 8<) but I digress.
I would strongly disagree with HadCRU's massively interpolated and approximated and averaged and "spread out" "global temperatures", but – after all – the IPCC believes they are accurate enough to justify destroying the world's free market economies!
The IPCC's bureaucrats and "scientists" are funded with billions to explicitly produce global warming hysteria to justify trillions in economic penalties and taxes. Yet the IPCC supporters use ancient, undocumented, untested programs and software routines notorious for sloppy, unprofessional, and completely false outcomes. Be advised that a simple Excel routine IS a welcome change of pace from Hansen's, HadCRU's and Mann's sloppy trash and false numerology.

Editor
April 26, 2010 10:43 am

Pascvaks (05:35:26) :
A thought came to me after scanning through all this. It doesn’t relate to anything anyone said, as far as I recall.
‘Scientific and Technical papers should be published using pen names, with no reference to the author’s identity or education background.’
No need to applaud.
—…—
Rather,
‘TAXPAYER-FUNDED Scientific and Technical papers [and referee comments] SHOULD be published using full names, with ALL references INCLUDING the author’s identity AND education background AND email addresses and ALL data and ALL programs FULLY AVAILABLE to the world.’

Dave
April 26, 2010 10:47 am


I always find it funny when someone accuses the Intergovernmental Panel on *Climate Change* of only recently starting to shift its case towards Climate Change.

bob
April 26, 2010 11:06 am

Just which HadCRUT3 did he use? HadCRUT3 has been above 0.5 for the annual mean since 2001.
from here
http://www.cru.uea.ac.uk/cru/data/temperature/crutem3gl.txt
Which puts it right in the mix of all the models he shows on the chart.
If I can do that, what would Realclimate do?

Tenuc
April 26, 2010 11:13 am

davidmhoffer (04:57:22) :
[…Tenuc
However, moving from this stage to building a predictive model is still fraught with difficulty. We do not have sufficiently accurate climate data of a suitable spacial and temporal granularity to make long-term models viable…]
“No. But it seems we have enough for short term (10 to 20 years). As the article author points out, his model is a fit for the last 129 years. I only started looking into climate in detail a few months ago, but the 60 year (give or take) cycle jumped out at me a long time ago, suggesting a peak about now followed by a cooling trend for a decade or two. With ice extent increasing, ocean hot content dropping, land temperatures flat, solar activity down, etc etc etc it seems like that is exactly what is happening…”
I think you are right, the 60(ish) year cycle is significant and has happened too often in the past to be a fluke, so maybe using Girma’s model to predict a couple of decades ahead is a reasonable approach. However, there are other longer-term cycles at work which are superimposed on Girma’s model, as shown below, which could slightly change the result and of course we always need to beware of a possible ‘black swan’ event.
1410-1500 cold – Low Solar Activity (LSA) (Spörer minimum)
1510-1600 warm – High Solar Activity (HSA)
1610-1700 cold – (LSA) (Maunder minimum)
1710-1800 warm – (HSA)
1810-1900 cold – (LSA) (Dalton minimum)
1910-2000 warm – (HSA)
2010-2100 (cold???) – (LSA???)

April 26, 2010 11:32 am

so I guess it looks like all this warming truly is MANNmade.

Gary Pearse
April 26, 2010 12:06 pm

Richard Telford (13:23:18) :
“Your model is very useful. Extrapolating back in time to the early part of the last ice age, your model predicts temperatures a below absolute zero. I knew the ice ages were cold, but that is perhaps a tad excessive. Predicting forwards, we can determine how long it will be before the oceans boil”
Your type of thinking is exactly what created the CAGW debacle: a gang of physicists who jumped to conclusions on the basis of a 1.5-century temperature record. I don’t suppose you gave a thought to the idea that the trend in question may actually be quite useful for a time period of more immediate interest to existing humans, say a few generations. The warm and cold periods that have occurred on multi-century scales – the Roman Warm Period, followed by a cool period to the Middle Ages when a new warm period arrived, only to be bent down into the Little Ice Age which we are still crawling out of – suggest multi-century sinusoidal variations as well, and then the ice ages themselves form a larger scale of variation. I’m sure it is useful to add these other undulatory trends in for other purposes, and the “little sinusoidal variations” of this article would be unnecessary for that long view… but probably still there!

toby
April 26, 2010 12:07 pm

The “oscillating anomaly” seems a Finagle’s Constant to me. What part of climate physics justifies it?
I modeled data similar to this (I think it was GISS) with time-series software and got a satisfactory fit for a model with something like trend, autocorrelation (2) and moving average (1), plus a seasonality of three to five years. I assume the seasonality comes from ENSO. Unfortunately, I do not have my work to hand as I write.
Where are the metrics like AIC or BIC that shows this model is better than its competitors?
I intend redoing my own models and comparing with this one. Of course, I will be in touch with the author, as I am sure will others. For the moment, the jury is still out.

April 26, 2010 12:08 pm

wow

Mike Edwards
April 26, 2010 1:25 pm

I see no model here. This is just a mathematical fit to part of the historical data: a linear increase modified by a sinusoidal fluctuation. Fine as far as it goes, but it has no predictive power whatsoever.
A true model would explain what factors in the climate system are causing both the linear increase and the sinusoidal fluctuation. Without that explanation, who knows whether the same pattern will continue next week, let alone for the next 90 years.
One thing we know pretty well for certain is that the linear trend does not continue back in time very far. So how is it possible to tell how much further it will continue into the future?
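For concreteness, the shape under discussion – a linear increase modified by a sinusoidal fluctuation, with the 2*pi factor converting the cosine argument to radians as the article notes for its Excel formula – can be written down in a few lines. The coefficients below are made-up placeholders for illustration, not the article's fitted values:

```python
import numpy as np

def gmta_model(year, offset, slope, amp, period=60.0, ref=1880.0):
    """Linear trend plus one sinusoidal oscillation.

    The 2*pi factor converts the cosine argument to radians
    (use 360 instead if working in degrees, as in Excel).
    """
    t = np.asarray(year, dtype=float) - ref
    return offset + slope * t + amp * np.cos(2.0 * np.pi * t / period)

# Placeholder coefficients, chosen only to illustrate the shape
years = np.arange(1880, 2031)
anomaly = gmta_model(years, -0.5, 0.006, 0.2)
```

Whether such a curve has predictive power is exactly the point being argued; the sketch only shows that the functional form itself is trivial to evaluate.
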

davidmhoffer
April 26, 2010 1:31 pm

Dave (10:47:35) :

I always find it funny when someone accuses the Intergovernmental Panel on *Climate Change* of only recently starting to shift its case towards Climate Change.>>
That’s your best defense? After decades of reports on AGW, CAGW, tipping points, stakes high, matters urgent, global catastrophe looming, immediate action required or we will all DIE FROM GLOBAL WARMING unless we destroy the economies of the first world and transfer their wealth to the third world to stop not just global warming but CATASTROPHIC GLOBAL WARMING, now all of a sudden that we find out that the data is corrupt, the studies fraudulent, and the trend flat or cooling despite record high CO2, NOW itz…
…well, hmph, itz about climate change, it was always about climate change, look itz even in our name, sniff… never mind what we said for 20 years, its in our name (haughty sneer), what do you think is important 20 years of what we said, or our name? How stupid are you if you can’t read our name?
Try again “Dave”. Come up with something more creative than pretending the past never happened.

Editor
April 26, 2010 1:34 pm

stevengoddard (07:07:31)

Ric Werme (05:26:49) :
Joe’s analysis plotted USHCN vs. CO2, not global temps. Global temps correlate much better.
There are four possibilities concerning the GISS/CRU vs. CO2 graphs
1. Coincidence. Not very likely. …

Steven, I’ll go over this again. Since the correlation of temperature with CO2 is no better than the correlation of temperature with a straight line, coincidence is not unlikely. For example:

There is no statistical difference between any of those correlations. Given that, coincidence is more than likely. It is probable.

April 26, 2010 2:47 pm

Willis Eschenbach (13:34:51) :
The linear relationship between temperature and CO2 goes through the entire GISS and CRU record, for 120 years, not just back to 1959. I can assure you that the correlation between my age and the year is very poor going back to 1890, as is the case with most of the items in your graph.
The relationship between CO2 and temperature is well defined by physics, and the measured values over the last 120 years closely match what we would expect from non-feedback CO2 forcing.

davidmhoffer
April 26, 2010 4:27 pm

OK, I gotta ask.
Are “stevengoddard” and “steve goddard” two different people?
REPLY: No

Editor
April 26, 2010 4:46 pm

AusieDan (20:29:09) : “I suggest several minor changes:
Start in 1878 (at least notionally, although the first two years data is absent).
Use NCDC annual data, as this gives a much closer fit.
Work with a 65 year cycle (half cycles 1878-1911, 1912-1943, 1944-1975, 1976-2008).
Use zigzags rather than cosine.”

I did something like this last year, data starting in 1850, using Hadcrut3 and UAH monthly data.
http://members.westnet.com.au/jonas1/GlobalTemperature_PDOPhaseTrends.JPG
I did a multi-segment least-squares fit, starting with the approximate start and end dates of the PDO phases and optimising both vertically (temperature) and horizontally (date).
The final optimised dates were:-
– warming to 1878
– cooling to 1910
– warming to 1939
– cooling to 1976
– warming to 2003
The phases were not all of exactly equal duration. Given the small amount of data available after 2003, the last date may also be unreliable (not that any date is particularly reliable).
The overall trend, using two complete cycles 1878 to 2003, was +0.41 deg/century, or +0.53 deg/century using Hadcrut3 only. The difference might be from the Urban Heat effect.
http://members.westnet.com.au/jonas1/GlobalTemperature_CyclesAndTrend.jpg
—–
Ulric Lyons (03:26:10) : “We had SSN 70+ in February”
Actually 18.6, according to
http://solarscience.msfc.nasa.gov/greenwch/spot_num.txt
On a daily basis, the max appeared to be 49 on 18 Feb
http://www.swpc.noaa.gov/ftpmenu/forecasts/SRS.html
http://www.swpc.noaa.gov/ftpdir/forecasts/SRS/0218SRS.txt

AC Adelaide
April 26, 2010 4:48 pm

Re :Mike Edwards (13:25:56) :
You say: “I see no model here….”
I say: Is it better to have a “model” that fits the data with no explanation, or a “model” that has an explanation but doesn’t fit the data? What is the probability that the IPCC’s genuine models will fit the data in 90 years, given that they didn’t even see out the current decade? What’s more, I’m not sure that the IPCC has much in the way of an explanation for past temperature excursions. Seems like someone should let go of their genuine model and go back to the drawing board.
On the other hand, I suggest that it’s pretty well accepted that anthropogenic CO2 causes catastrophic global warming. If global surface temperatures are dropping now, then clearly there must be a problem with the atmospheric CO2 measuring devices. These need to be urgently recalibrated to fit the surface temperature data in order to save the credibility of the IPCC models. The scope for re-calibrating the temperature data seems to have passed.

April 26, 2010 5:05 pm

Dennis Nikols (22:25:58) :
. . . I submit the following discussion that I prepared for one of my essays.
“Occam (of Occam’s razor) expresses a principle; he is an English logician and Franciscan friar (William of Ockham, 1285-1349 CE). . . .

Unless CE stands for “Christian Era,” surely you meant A.D. 1285-1349 for a friar.

Editor
April 26, 2010 5:21 pm

How does the CO2 trend – assuming the AGW advocates’ claim of constant CO2 (until “evil mankind” began raising it artificially in the mid-50s, then a rapid increase thereafter) – fit the 30-year cyclical temperature trend?
Nothing but the PDO + AMO changes, or solar influence, fits those up-and-down cycles….
A “least squares best fit” is merely… a straight line. It is nothing but a “naive guess” – as complained about by an equally critical observer earlier.

April 26, 2010 5:34 pm

Steven mosher (09:53:39) :
Exactly! The interesting thing to me is that the observed T/CO2 relationship shows little evidence of positive feedback, which is the cornerstone of Hansen’s belief system.
Without feedback, the temperature stays within 2C (the oft stated “safe temperature”) and thus the whole CAGW religion has no support from observation.
And with these sorts of discussions, we take the high road on the scientific side.

April 26, 2010 5:39 pm

Mike Jonas (16:46:12)
Date Flux SSN
2010 02 08 94 71
source: http://www.swpc.noaa.gov/ftpdir/latest/DSD.txt

April 26, 2010 5:58 pm

Mike Jonas (16:46:12)
That is the report number you are looking at, not the sunspot number!!
or are you reading 1049? that was a region.
http://www.swpc.noaa.gov/ftpdir/forecasts/SRS/0218SRS.txt
The SESC sunspot number definitely was SSN 71 on the 8th Feb.

April 26, 2010 6:13 pm

I can't find my way around the swpc files, but you can see it also in the spaceweather archive: http://www.spaceweather.com/archive.php?view=1&day=09&month=02&year=2010 down the left-hand column.

Roger Knights
April 26, 2010 7:27 pm

Dr. Roy Spencer will be interviewed tonight (Mon.) on Coast-to-Coast from 10pm to 2am Pacific time.

Editor
April 26, 2010 9:41 pm

Ulric Lyons : “The SESC sunspot number definitely was SSN 71 on the 8th Feb.”
You are quite right. My apologies. The site from which I source SSN, USAF/NOAA daily Sunspot Summary:
http://www.swpc.noaa.gov/ftpmenu/forecasts/SRS.html
unfortunately only gives the last 75 days, and 8 Feb has just dropped off the list. I do have the data archived, and for 9 Feb (it evidently differs from your source by a day) it was indeed 71:
1045 N23W17 256 0420 Fkc 20 35 Beta-Gamma-Delta
1046 N24E52 186 0030 Bxo 10 04 Beta
1047 S15E70 169 0010 Axx 02 02 Alpha
(10+35 + 10+04 + 10+02 = 71)
I have now checked everything again and found where I had treated multiple groups incorrectly for some past dates. Corrected graph posted here:
http://members.westnet.com.au/jonas1/SunspotGraph.JPG
By my calculations, the average SSNs for Jan-Mar 2010 were 21.5, 30.8, 24.2, but the NASA monthly SSN data at http://solarscience.msfc.nasa.gov/greenwch/spot_num.txt appears to me to show them as 13.1, 18.6, 15.4.
There is a comment on their website at
http://solarscience.msfc.nasa.gov/greenwch.shtml
Note that the data in the raw data files (gyyyy.txt) are uncorrected for the change in data source in 1977. The derived data [..] now ALL include the correction factor of 1.4x after 1976/12/31.
which shows how the data needs handling with care but doesn’t explain the discrepancy.
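The regional arithmetic above follows the standard Wolf convention, R = k*(10g + s): each numbered region contributes 10 plus its spot count. A sketch:

```python
def wolf_number(region_spot_counts, k=1.0):
    """Wolf/SESC sunspot number: R = k * (10 * groups + total spots)."""
    groups = len(region_spot_counts)
    spots = sum(region_spot_counts)
    return k * (10 * groups + spots)

# Three regions (1045, 1046, 1047) with 35, 4 and 2 spots, as listed above
print(wolf_number([35, 4, 2]))  # 10*3 + 41 = 71
```

The k factor is the observatory correction mentioned on the NASA page (e.g. the 1.4x applied after 1976), which is one obvious source of discrepancies between series.
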

Pamela Gray
April 26, 2010 9:47 pm

Except that temps and SST’s correlate better than any of it. Add oceanic/atmospheric oscillations and ya got a winner! I believe it will eventually be possible to predict when a cold spell is overdue, much like earthquakes are predicted today. We don’t know what day, but we do know that one is coming.

Girma
April 26, 2010 9:59 pm

What we must not forget is that we need an answer.
1. The IPCC claims a warming of 0.2 deg C per decade and accelerated warming [Figure 5].
2. Not to us, but among themselves in private, they say “slow changes over past decade” [REF 4] and “the level has really been quite stable since 2000 or so and 2008 doesn’t look too hot” [REF 5] (so in private they don’t agree with point 1).
3. The question each of us has to answer is: in the next 10 to 20 years, will the globe see further warming or cooling?
Based on my article, as the GMTA pattern has held for 129 years, I bet anyone that we will have global cooling in the next 10 to 20 years.

Girma
April 27, 2010 12:18 am

Cheering News:
The Australian Government shelved its Emission Trading Scheme (a tax on air).
That is a big win.
The plant food is not a pollutant, yet!

Sunshine
April 27, 2010 12:35 am

Very good article. Being a “scientist” myself, but in a realm far away from climatology or meteorology, I was able to follow the general theme and found it quite “rational”. I think that is going to be the problem with this presentation: it is too simple and logical for any politician etc. to consider plausible.
OT: I consider myself a skeptical skeptic when it comes to AGW; truth is I don’t know what to believe. I am certain that the debate is not over, and I am not convinced that CO2 (my science is biochemistry) is in fact a factor in GW at all, certainly not enough to take on the heroic efforts and costs associated with eliminating it. I am supportive of alternative energy etc. and welcome the time when those are widely available and affordable for everyone. What concerns me is the AGW supporters’ constant claim that NOT ONE CLIMATOLOGIST (is that even a university degree?) stands on the side of the “deniers”… Is this true, or is it that in order to be titled “Climatologist” you have to believe in AGW?

Editor
April 27, 2010 12:45 am

stevengoddard (14:47:21)

Willis Eschenbach (13:34:51) :
The linear relationship between temperature and CO2 goes through the entire GISS and CRU record, for 120 years, not back to 1959. I can assure you that the correlation between my age and the year is very poor going back to 1890, as is the case with most of the items in your graph.

Not true. CO2 statistically does no better as an explanatory variable than a straight line since 1880. And it does no better than the US Postal Rates since 1880. The 120-year correlation of temperature with the cost of mailing a letter is just as good as the 120-year correlation of temperature with CO2.
You repeatedly claim that this correlation means something, but it doesn’t mean any more than the 120-year correlation of temperature with the distance to Alpha Centauri.

The relationship between CO2 and temperature is well defined by physics, and the measured values over the last 120 years closely match what we would expect from non-feedback CO2 forcing.

Absolutely not, that’s why there is huge debate about size of the “climate sensitivity”. The fact that CO2 is a greenhouse gas is well defined by physics. The effect of CO2 on temperature is what all the debate is about, and it is neither well defined nor well understood …
This “it’s just simple physics” argument is nonsense. See my post on “The Unbearable Complexity of Climate” if you don’t understand why that is so.
In addition, if the “CO2 drives temperature” theory were true, you’d expect the logarithm of CO2 to correlate better with temperature than the raw CO2 level does. But this is not the case. Over the 120-year period, the two correlations are within 0.01 of each other. Perhaps you can explain that, as it directly contradicts your theory.
You’ll be glad to know, however, that the best correlation in the bunch is the 120-year correlation between CO2 and US Postal Rates …
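One mundane point worth checking on the log(CO2) question: over the observed range of roughly 290-390 ppm, log(CO2) is itself almost a linear function of CO2, so any third series will correlate with CO2 and with log(CO2) almost equally well. A quick sketch:

```python
import numpy as np

# Over roughly 290-390 ppm, log(CO2) is nearly linear in CO2, so
# correlations against CO2 and log(CO2) are almost indistinguishable.
co2 = np.linspace(290.0, 390.0, 121)     # ppm, about the 120-year span
r = float(np.corrcoef(co2, np.log(co2))[0, 1])
print(r)
```

The correlation of CO2 with its own logarithm over this range comes out extremely close to 1, which is one reason the two regressions are hard to distinguish on a century of data.
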

Brent Hargreaves
April 27, 2010 3:16 am

John A. (23:30:34) wrote: “I think the temptation to think that “the future is a continuation of the present trends” is at the root of the global warming scare and pretty much every apocalyptic movement there has ever been.”
John, you are bang on target.
The brainless extrapolation of a trend without seeking to identify the processes behind it is to be avoided like the plague. Before Kepler and Newton, one might reasonably have extrapolated the Earth’s orbit over a cherry-picked timescale (say, a week) and claimed that we were headed for Alpha Centauri. Without the underlying physics, it’s just numerology.
This gentleman claims, having identified a 129-year trend, that it’s reasonable to suppose it will continue another 20. Sorry, this is not science; it’s no better than IPCC numerology.

MartinGAtkins
April 27, 2010 3:16 am

stevengoddard (09:31:38) :
You came up with the same numbers I did. At 550 PPM, the temperature increase is 1.5C above the present.
If we use the figures we’re using now:-
1958/3-2009/11 = 621 months.
CO2 1958/3 = 316ppm, 2009/11 = 386ppm. So 386 – 316=70ppm.
70ppm / 621 months = 0.11272142ppm per month.
Now if we assume CO2 constant growth @ 0.11272142ppm per month
and we calculate from 2009/11 at the present 386ppm and we get.
550 – 386 = 164
164 / 0.11272142 = 1455 months, or about 121.25 years to reach 550ppm.
Checking the 1.5C figure (not that I don’t trust you) and using my figure of 0.00928571C per 1ppm CO2 I have:
1.5 / 0.00928571 = 161.53853609ppm
So my temp rise per 1ppm CO2 is a little higher than yours.
1.5 / 0.00914671 = 163.99339216ppm
The difference of 0.000139C per 1ppm is not statistically significant, so our findings are robust.
We can say that by the year 2131 it’s gonna be 1.5C warmer.
I’ll alert the media.
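Re-running the linear-extrapolation arithmetic with the figures quoted above (a sketch, taking the constant growth rate at face value):

```python
# Taking the quoted constant CO2 growth rate at face value:
rate = 70.0 / 621.0                       # ppm per month, 1958/3 to 2009/11
months_to_550 = (550.0 - 386.0) / rate    # months from 386 ppm to 550 ppm
years_to_550 = months_to_550 / 12.0
print(round(months_to_550), round(years_to_550, 2))
```
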

Spector
April 27, 2010 3:20 am

It is my personal opinion that correlation is a poor choice of indicator to form the basis for a scientific theory because this is, at best, only circumstantial evidence. I also agree that our general climate dynamics may be so complex that no elegant solution or theory will ever be possible.
My suggestion would be to find the simplest possible environments, such as a cloudless tropical desert or a cloudless arctic night, calculate the expected greenhouse effects for those conditions, and compare the predictions with results observed over the years.
I do remember the purple three cent Jefferson stamps and the 19-cent hamburgers.
Reply: I remember 10 cent hamburgers and 10 cent cokes and I’m only 52. Upon doing research either it was a special, or I have a false memory. ~ ctm

April 27, 2010 4:38 am

Willis Eschenbach (00:45:33) :
dT/dt and dCO2/dt are not constants (linear slope) as you keep insisting.
https://spreadsheets.google.com/oimg?key=0AnKz9p_7fMvBdE9rZ3lzMHRRaGxUb3JHRXZfU0daeWc&oid=2&v=1272367154522
The rate of increase of CO2 has increased sharply since the 1950s, just as the rate of increase in my age has increased sharply since the 1950s. Furthermore, the relationships you chose (like population growth) are directly tied to the amount of CO2 in the atmosphere. If there are more people, there are more cars. They are not the independent relationships you claim them to be.
What is a constant is the ratio of dT/dCO2.
https://docs.google.com/Doc?docid=0AXKz9p_7fMvBZGR3ODJ3d3NfNjE2Yzdxc2MzZ20&hl=en
We expect from physics that temperature will increase with CO2. I don’t think you will find a serious atmospheric physicist (including Spencer or Lindzen) who believes otherwise. Spencer and Schmidt have both stated that they expect to see a doubling of CO2 produce a 1.2 degree increase in temperature, similar to what observations and Stefan-Boltzmann predict.
https://docs.google.com/Doc?docid=0AXKz9p_7fMvBZGR3ODJ3d3NfNjE2Yzdxc2MzZ20&hl=en

April 27, 2010 4:43 am

MartinGAtkins (03:16:31) :
You provided a constant of 0.00928571 C / PPM
(550-390) * 0.00928571 = 1.49
Your formula predicts a further increase in temperature of ~1.5C going from 390 PPM to 550 PPM, just as my plot and physics does.
https://docs.google.com/Doc?docid=0AXKz9p_7fMvBZGR3ODJ3d3NfNjE2Yzdxc2MzZ20&hl=en

April 27, 2010 4:45 am

Meant to say :
“Spencer and Schmidt have both stated that they expect to see a (non-feedback) doubling of CO2 produce a 1.2 degree increase in temperature”

April 27, 2010 4:55 am

Girma (21:59:12) :
The low-end IPCC estimates predict a warming of 2C from “pre-industrial levels”, which is much less than a 0.2C increase per decade.
The problem with the IPCC report is that it predicts a “most likely” 3C, and a high end of 6C – both of which appear to be too high.

Chris Wright
April 27, 2010 5:01 am

stevengoddard (07:00:14) :
Chris Wright (03:24:11) :
“There are no tricks in the plots. They are straight up Temperature vs. CO2 and show good correlation. The fact that you don’t like the results doesn’t invalidate them.”
The first argument against these graphs is that you could demonstrate equally good correlation with lots of random factors, but Willis got in there first. Bearing in mind that global temperatures have been increasing since the end of the LIA, it’s hardly surprising that the temperature shows an upwards trend, just as CO2 does.
My original point was that there is no filtering, so the points tend to fill a wide band that looks linear. Of course, the standard plots of temperature against time show that for much of the time temperatures are going down while CO2 goes up.
Some time ago I reproduced those graphs, but also added some filtering as follows:
1. No filtering:
http://www.kline.demon.co.uk/AANoFiltering.jpg
It looks similar to the plots you showed. Blue line is CO2, red is temperature against time (I think it was CRUTEMP, I did the plots a long time ago).
2. One year rolling average
http://www.kline.demon.co.uk/AAOneYearFilter.jpg
With filtering it now looks very similar to CRUTEMP (of course, the horizontal scales are different, the upper one is plotted against CO2, not time). That’s what you would expect, as the CO2 graph is, very approximately, a straight line.
When you remove the noise caused by short-term variation, what seemed to be vaguely a straight line is now obviously not. The lack of warming over nearly the last decade shows a distinct lack of correlation. The Climategate emails show a prominent team member agonising over the lack of warming. Doesn’t sound like high correlation to me.
The reason why I don’t like those plots is that no filtering was used. Filtering is a standard procedure in climate science because it reduces the noise and makes trends easier to see. With this kind of plot the noise makes it look like a straight line, which is clearly misleading.
So, an obvious question: why was no filtering used? I think I know what the answer to that is, but what do you think?
Chris
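The 12-month rolling average Chris describes is simple to reproduce. A minimal sketch (pure Python, trailing window, applied here to made-up numbers rather than the CRUTEM series):

```python
def rolling_mean(series, window=12):
    """Trailing moving average: each output is the mean of the previous
    `window` values, suppressing sub-annual noise in monthly anomalies."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

# Toy demonstration on an alternating "monthly" series.
data = [0.1, 0.3] * 12          # 24 synthetic months
smooth = rolling_mean(data)
print(round(smooth[0], 3))      # -> 0.2 (the short-term alternation averages out)
```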

April 27, 2010 5:06 am

Dr Orssengo – a careful, thoughtful and well presented analysis from a new perspective – thank you sir.
Its best quality is the very good fit with reality for recent decades and its greatest value is the projection for the next five to 10 years.
We are at or near a ‘tipping point’ in public opinion on climate change in the US and getting close in the UK and other countries. High quality input like yours will influence many thousands of thinking people everywhere and hasten the mass shift away from the AGW mania that has gripped the world for over 20 years.
I don’t think we will have to wait 10 years to see the benefit of your work – another couple of winters like the last one in Europe and the USA and we will see a seriously large majority of people who no longer believe in AGW or greenhouse gases. Soon the sheer weight of public opinion will put Messrs Gore, Pachauri, et al, firmly on the back foot. Next the important work of demolishing the whole Carbon edifice – low Carbon energy, Carbon footprints, the Carbon Trust, Cap and Trade, Carbon offsets, Carbon emission permits, etc., etc., can get going. The proponents are well entrenched and well funded (with our tax dollars) so it will be an epic struggle – but thanks to people like you, Anthony and the great army of like minds, this is now, at last, a winnable fight.
Thank you again.
d marloil

April 27, 2010 5:20 am

Chris Wright (05:01:04) :
Try scaling your CO2 plot by 0.5 along the Y-axis and see what happens.
https://spreadsheets.google.com/oimg?key=0AnKz9p_7fMvBdE9rZ3lzMHRRaGxUb3JHRXZfU0daeWc&oid=2&v=1272370776872
You are plotting the same data as me, and there is no reason to expect different results.

April 27, 2010 5:38 am

Chris Wright (05:01:04) :
Interesting to hear people here complaining about plots of raw data.
I keep wishing that NCDC, USHCN and GISS would make plots of raw data readily available.

April 27, 2010 5:51 am

stevengoddard (05:38:25) :
“I keep wishing that NCDC, USHCN and GISS would make plots of raw data readily available.”
When GISS posts what they say is “raw data,” why does it change from one month to another? click

April 27, 2010 6:08 am

Smokey (05:51:40) :
Looks to me like you have mixed raw data with homogeneity adjusted data in your blink graphs. The current Decatur raw plot looks just like your 2006 raw plot:
http://data.giss.nasa.gov/cgi-bin/gistemp/gistemp_station.py?id=425745600020&data_set=0&num_neighbors=1

April 27, 2010 6:09 am

Smokey (05:51:40) :
Also, GISS doesn’t provide graphs of raw station data. Their graphs all include USHCN adjustments.

April 27, 2010 6:19 am

stevengoddard (06:09:57),
Do you know why it is presented as raw data? And why the chart data history changes from one month to another?
Here’s another example: click

Girma
April 27, 2010 6:57 am

Gregory
The reason I chose the cosine function is that when I detrend (remove the trend from) the observed GMTA data, I get the following oscillating anomaly, which can be approximated by a cosine function.
http://www.woodfortrees.org/plot/hadcrut3vgl/from:1880/detrend:0.706/offset:0.52/compress:12
Gregory, I am not sure we can use my model for more than 20 or 30 years. We have only 129 years of data. My main motivation was that the IPCC’s prediction of 0.2 deg C per decade for this and the next decade had failed, and what I am trying to find out is whether we will have further warming or cooling in the next 20 years.
Thank you.
Girma
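Girma’s procedure (remove a linear trend, then fit a fixed-period oscillation to the residual) can be sketched as follows. This is a generic illustration on synthetic data with made-up coefficients and an assumed ~60-year period, not a reproduction of the actual GMTA fit; with the period fixed, the model is linear in its parameters, so plain least squares suffices:

```python
import numpy as np

PERIOD = 60.0  # years; assumed multidecadal period
rng = np.random.default_rng(0)
t = np.arange(1880.0, 2009.0)  # 129 "years" of synthetic data
truth = -0.3 + 0.005 * (t - 1880) + 0.2 * np.cos(2 * np.pi * (t - 1880) / PERIOD)
y = truth + rng.normal(0.0, 0.05, t.size)  # add observation noise

# Model: a + b*t + c*cos(w*t) + d*sin(w*t); linear in (a, b, c, d).
w = 2 * np.pi / PERIOD
X = np.column_stack([np.ones_like(t), t, np.cos(w * t), np.sin(w * t)])
(a, b, c, d), *_ = np.linalg.lstsq(X, y, rcond=None)

print(round(b, 4))                       # recovered linear trend, ~0.005 C/yr
print(round(float(np.hypot(c, d)), 2))   # recovered oscillation amplitude, ~0.2 C
```

Whether such a fit may legitimately be extrapolated beyond the data is, of course, the separate question argued at length in this thread.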

April 27, 2010 7:01 am

Smokey (06:19:27) :
They say “raw GHCN data+USHCN corrections”
http://data.giss.nasa.gov/gistemp/station_data/

Editor
April 27, 2010 7:40 am

Sunshine (00:35:01) :
Very good article, being a “scientist” myself but in a realm far away from climatology or meteorology I was able to follow the general theme and found it quite “rational”. … What concerns me is the AGW supporters’ constant claims that NOT ONE CLIMATOLOGIST (Is that even a university degree?) stands on the side of the “deniers”……Is this true, or is it that in order to be titled “Climatologist” you have to believe in AGW?
—…—…—
An absolutely false claim.
Anyone who makes it is simply and blatantly lying. Further, more than 33,000 multi-degreed scientists, statisticians, programmers, weathermen, engineers, and – yes – even climatologists – have signed a petition verifying their background and publicly declaring that the recent global warming is neither “at a tipping point”, catastrophic, nor entirely man-made by CO2 releases. So your observation is shared by many tens of thousands of others. Despite the alarmists’ press releases.
I will admit that EVERY climatologist and environmental activist and liberal politician whose work will be funded by past, present and future UN and IPCC tax money DOES stridently believe that his or her future paycheck will be linked to man-made global warming …

April 27, 2010 7:52 am

RACookPE1978 (07:40:50) :
I think you will be hard pressed to find a climatologist who does not believe that man has affected the climate, including Pielke(s), Spencer and Lindzen. The primary disagreement is quantitative with respect to CO2 sensitivity.

Wren
April 27, 2010 7:59 am

RACookPE1978 (07:40:50) :
Sunshine (00:35:01) :
Very good article, being a “scientist” myself but in a realm far away from climatology or meteorology I was able to follow the general theme and found it quite “rational”. … What concerns me is the AGW supporters’ constant claims that NOT ONE CLIMATOLOGIST (Is that even a university degree?) stands on the side of the “deniers”……Is this true, or is it that in order to be titled “Climatologist” you have to believe in AGW?
—…—…—
An absolutely false claim.
Anyone who makes it is simply and blatantly lying. Further, more than 33,000 multi-degreed scientists, statisticians, programmers, weathermen, engineers, and – yes – even climatologists – have signed a petition verifying their background and publicly declaring that the recent global warming is neither “at a tipping point”, catastrophic, nor entirely man-made by CO2 releases. So your observation is shared by many tens of thousands of others. Despite the alarmists’ press releases.
=======
Who wouldn’t agree the recent global warming is neither “at a tipping point”, catastrophic, nor entirely man-made by CO2?
But the concern is about the future.

April 27, 2010 8:08 am

Girma (06:57:18) :
Your observations of oscillation around the mean trend are quite accurate, and it would be great if someone could quantify the root physical causes.
Nevertheless, the long term trend is upwards, with those oscillations ultimately proving to be little more than noise.

April 27, 2010 8:36 am

Mike Jonas (21:41:44) :
“I do have the data archived, and for 9 Feb (it evidently differs from your source by a day) it was indeed 71:”
That is the same as the spaceweather site, they released the count for the 8th, on the 9th, makes sense yes? A spectacular level of x-ray events on the 8th too.

Agile Aspect
April 27, 2010 9:07 am

stevengoddard (13:32:13) :
“we should see warming close to the non-feedback response of Stefan–Boltzmann”
But that’s only if the calculation is done incorrectly. The physical temperature is the expectation of T, not the expectation of T^(1/4), i.e., the thermometer measures T, not T^(1/4), and you need Hölder’s inequality to evaluate it. When the expectation of T is calculated correctly, the temperature is negative.
I’ll ignore the fact that Stefan-Boltzmann can’t be applied to the upper atmosphere, since there is no surface up there – but that’s another can of worms, and one actually has to go through the derivation of Stefan-Boltzmann to understand why.

Agile Aspect
April 27, 2010 9:32 am

stevengoddard (13:32:13) :
“we should see warming close to the non-feedback response of Stefan–Boltzmann”
(It looks like the expectation operator is not being displayed, deleting what’s in the parentheses.)
But that’s only if the calculation is done incorrectly. The physical temperature is the expectation of T and not the expectation of T^(1/4), i.e., the thermometer measures T, not T^(1/4), and you need Hölder’s inequality to evaluate it. When the expectation of T is calculated correctly, the temperature is negative.
I’ll ignore the fact that Stefan-Boltzmann can’t be applied to the upper atmosphere, since there is no surface up there – but that’s another can of worms, and one actually has to go through the derivation of Stefan-Boltzmann to understand why.
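The averaging-order point raised here can be illustrated numerically: for any non-uniform temperature field, the temperature inferred from the mean emitted flux, (mean T^4)^(1/4), exceeds the plain mean of T (Jensen’s inequality for the convex function T^4), so the two averaging orders are not interchangeable. A minimal check with illustrative, made-up temperatures (this demonstrates only the inequality, not the commenter’s further conclusions):

```python
import numpy as np

T = np.array([220.0, 255.0, 290.0, 310.0])  # illustrative temperatures, K
mean_T = float(T.mean())                    # plain spatial mean of T
T_eff = float(np.mean(T ** 4)) ** 0.25      # temperature from the mean T^4 flux

print(T_eff > mean_T)  # -> True: the two averaging orders give different numbers
```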

April 27, 2010 10:14 am

Agile Aspect says:
Re : “When expectation of T is calculated correctly, the temperature is negative.”
Sounds like you have calculated that greenhouse gases cool the atmosphere. Interesting theory.

Editor
April 27, 2010 12:44 pm

stevengoddard says:

April 27, 2010 at 4:38 am
Willis Eschenbach (00:45:33) :
dT/dt and dCO2/dt are not constants (linear slope) as you keep insisting.

I ask again, and will continue to ask. If you disagree with something I have said, QUOTE IT. I never said that dT/dt or dCO2/dt are constants.

The rate of increase of CO2 has increased sharply since the 1950s, just as the rate of increase in my age has increased sharply since the 1950s. Furthermore, the relationships you chose (like population growth) are directly tied to the amount of CO2 in the atmosphere. If there are more people, there are more cars. They are not the independent relationships you claim them to be.

So your claim is that the distance to Alpha Centauri and the cost of US stamps are “directly tied to the amount of CO2 in the atmosphere”??

We expect from physics that temperature will increase with CO2. I don’t think you will find a serious atmospheric physicist (including Spencer or Lindzen) who believes otherwise. Spencer and Schmidt have both stated that they expect to see a doubling of CO2 produce a 1.2 degree increase in temperature, similar to what observations and Stefan-Boltzman predict.

No. We expect from physics that forcing will increase, not with CO2 as you claim, but with log(CO2).
However, that is a very different question from whether temperature will increase from that. CO2 is not the only thing affecting the temperature. And as I pointed out above, log(CO2) does not correlate with temperature any better than CO2 does … I asked you to explain that, you might not have noticed.
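Willis’s log point can be made quantitative with the standard simplified forcing expression ΔF ≈ 5.35 ln(C/C0) W/m² (Myhre et al. 1998); the 5.35 coefficient and the ~3.3 W/m² per K Planck response used below are commonly quoted values, not figures taken from this thread:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified radiative forcing for a CO2 change, in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def no_feedback_warming(c_ppm, c0_ppm, planck_wm2_per_k=3.3):
    """Forcing divided by the Planck response: the no-feedback warming, in C."""
    return co2_forcing(c_ppm, c0_ppm) / planck_wm2_per_k

print(round(co2_forcing(560, 280), 2))          # -> 3.71 (one doubling)
print(round(no_feedback_warming(560, 280), 1))  # -> 1.1, near the 1.2 C cited above
```

Because the forcing depends only on the ratio C/C0, every doubling contributes the same increment, which is the logarithmic behaviour under discussion.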

April 27, 2010 2:10 pm

Willis, try plotting a log function. As you move towards the right, it moves asymptotically towards linear.
https://spreadsheets.google.com/oimg?key=0AnKz9p_7fMvBdE9rZ3lzMHRRaGxUb3JHRXZfU0daeWc&oid=3&v=1272402437835
The knee of the T vs CO2 graph is at about 20-30 PPM. 390 PPM is way out in the “almost linear” portion of the curve. Linear is a good estimate.
Your attempts to dissociate temperature from CO2 are futile. It is physical law.

phlogiston
April 27, 2010 2:28 pm

stevengoddard is a troll pretending to be Steve Goddard.
He sounds a bit like a naive version of Joel Shore – maybe Joel has a bright teenage son?
“stevengoddard”: “Your attempts to dissociate temperature from CO2 are futile. It is physical law.”
How about this physical law: more intense sunlight (all frequencies) hitting an object with the same geometry will make that object hotter. I’m sure both Stefan and Boltzmann would be happy with that. So the sun has increased its radiation output by about 25% over the last 3 billion odd years. So physical law dictates that the earth got much hotter over that time.
Only it hasn’t. Why?

April 27, 2010 2:42 pm

phlogiston says:
April 27, 2010 at 2:28 pm
So the sun has increased its radiation output by about 25 % over the last 3 billion odd years. So physical law dictates that the earth got much hotter over that time.
Only it hasn’t. Why?

You know the answer to that one: “much less CO2 now than 3 billion years ago”, right?

Editor
April 27, 2010 3:03 pm

stevengoddard says:
April 27, 2010 at 2:10 pm

Willis, try plotting a log function. As you move towards the right, it moves asymptotically towards linear. The knee of the T vs CO2 graph is at about 20-30 PPM. 390 PPM is way out in the “almost linear” portion of the curve. Linear is a good estimate.

Steven, you are proving my point … the correlation of CO2 with temperature is no better than the correlation of log(CO2) with temperature. Log(CO2), as you point out, is nearly a straight line. Give it up, you can’t prove anything by a correlation.

Your attempts to dissociate temperature from CO2 are futile. It is physical law.

Physical law? Like the Second Law of Thermodynamics, or the Law of Gravity, or Boyle’s Law, or Newton’s Laws of Motion? Funny, I’ve never heard of the “Law of CO2”.
CO2 is a greenhouse gas. It increases forcing. But with a complex system, that can mean lots … or nothing at all. For example, the biggest forcing is solar. And we know that if we put something in the direct sun, very soon it warms down to the core. Simple physics, a “physical law”, in your terms.
So when I go out in the sun, according to your “physical law”, my core temperature should go up … but guess what? In complex systems, simple physics is a security blanket for simple analysts, but has no application in the real world.
If you want an applicable law, you should study up on the Constructal Law. Unlike your pseudo-law, the Constructal Law governs complex flow systems … like, say, the climate.

MartinGAtkins
April 27, 2010 3:04 pm

stevengoddard says:
You provided a constant of 0.00928571 C / PPM
(550-390) * 0.00928571 = 1.49

My base was 2009/11 @ 386 so (550-386) * 0.00928571 = 1.52
390 is OK though, as I think that is closer to the latest data.
My problem is not with your results but with the lack of detail given for anyone assessing them. For instance, CO2 constant growth @ 0.11272142 ppm per month means the 160 ppm you need to get to 550 will take 1419 months, or 118 years. Not giving a reference for a time indicator renders your graph meaningless.
Using the latest CO2 data I’ve calculated that 0.1196 ppm per month is the average growth, so 160 / 0.1196 = 1337 months, or 111 years.
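Martin’s timing arithmetic checks out; a small sketch reproducing both estimates (a constant monthly growth rate is of course an assumption):

```python
def years_to_reach(target_ppm, current_ppm, growth_ppm_per_month):
    """Years until CO2 reaches target_ppm at a constant monthly growth rate."""
    months = (target_ppm - current_ppm) / growth_ppm_per_month
    return months / 12.0

print(round(years_to_reach(550, 390, 0.11272142)))  # -> 118 years
print(round(years_to_reach(550, 390, 0.1196)))      # -> 111 years
```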

April 27, 2010 3:11 pm

Willis,
CO2 has varied by 50% over the last 120 years. TSI has only varied by a percent or so during that period.
phlogiston ,
I have never disputed the fundamental relationship between CO2 and temperature. My dispute is with people using climate models that produce large amounts of positive feedback, which is not evident in temperature record. Without being able to model clouds or ocean circulation accurately, they are for all intents and purposes – useless.

Editor
April 27, 2010 3:41 pm

stevengoddard said on Predictions Of Global Mean Temperatures & IPCC Projections
April 25, 2010 at 11:58 am

Willis,
CO2 has varied by 50% over the last 120 years. TSI has only varied by a percent or so during that period.

Yes, I know that … and?
I ask again. If you are objecting to something that I have said, QUOTE MY WORDS.
Here once again you are objecting to some fantasy you have had. In this case, your fantasy seems to be that I am saying that the cause of the warming is solar … dude, I don’t know how to break this to you gently, but your reading comprehension skills need lots of work. I said nothing about solar causing warming, read what I wrote.

kwik
April 27, 2010 3:44 pm

Here is a low frequency curve for you guys to add to the formula;
http://www.phys.huji.ac.il/~shaviv/Ice-ages/GSAToday.pdf

April 27, 2010 3:54 pm

Willis,
This conversation is getting extremely dull …..
https://docs.google.com/Doc?docid=0AXKz9p_7fMvBZGR3ODJ3d3NfNjE2Yzdxc2MzZ20&hl=en

sky
April 27, 2010 4:08 pm

Willis,
Thanks for showing the correlation of postal rates with CO2. I’ve never really trusted USPS and now have “evidence” to support my feelings.

George E. Smith
April 27, 2010 5:58 pm

“”” The Stefan–Boltzmann law, also known as Stefan’s law, states that the total energy radiated per unit surface area of a black body in unit time (known variously as the black-body irradiance, energy flux density, radiant flux, or the emissive power), j*, is directly proportional to the fourth power of the black body’s thermodynamic temperature T (also called absolute temperature): “””
The above was lifted verbatim from a Wikipedia article cited by Steve Goddard. It’s a perfect example of why people say wiki is trash.
In this case the most obvious trash is this:- “”known variously as the black-body irradiance, energy flux density, radiant flux “”
Well no it isn’t any one of those things.
The quantity given by the Stefan-Boltzmann Law, the one that equates to sigma.T^4, is the RADIANT EMITTANCE, which could be shortened to simply EMITTANCE.
The word RADIANT is intended to denote the appropriate measures of quantities given in terms of WATTS, which of course is a POWER, or rate of ENERGY or of WORK, in the discipline generally known as RADIOMETRY. This distinguishes it from the related discipline of PHOTOMETRY, which relates solely to quantities that are registered by the human eye; where the appropriate substitute for WATTS is LUMENS.
So the term RADIANT can be omitted where it is obvious that we are referring to the appropriate quantities in terms of WATTS.
So what is wrong with that term “black-body irradiance” ?
Well there most certainly IS such a quantity as IRRADIANCE, and even black-body IRRADIANCE; but the problem is that IRRADIANCE is a measure of INCOMING, not OUTGOING which is what EMITTANCE is.
In fact IRRADIANCE is the exact opposite of EMITTANCE; and both are expressed in WATTS per METRE SQUARED.
So a black body may EMIT so many W/m^2 as its EMITTANCE; but what a body or surface RECEIVES in W/m^2 is its IRRADIANCE. In Photometry, the “light” received by an ILLUMINATED surface would be its ILLUMINANCE in Lumens per square metre.
Now the RADIATION EMITTED from a black body; or any body for that matter, is generally considered to be emitted into a full hemisphere; or 2pi STERADIANS of space, when talking about EMITTANCE, since in the limit, this is the contribution of any small element of the surface.
One then is led to enquire about the ANGULAR DISTRIBUTION of the emitted radiation from a surface element, which means examining the EMITTANCE per STERADIAN emitted in some particular direction; usually relative to the normal to the surface element. For this quantity; which now has the units of WATTS per SQUARE METRE per STERADIAN, we have a new name which is RADIANCE, and its photometric equivalent would be LUMINANCE in Lumens per square metre per steradian or if you prefer; Lumens per steradian per square metre; which is the same.
From a large distance, an Emitter starts to look like a point source, so we start to think of the WATTS per STERADIAN as a measure of the source. This quantity also has a name, which is RADIANT INTENSITY or simply INTENSITY, and it no longer contains any per square metre. The luminous equivalent is Luminous Intensity, which is Lumens per steradian, and the unit is called the CANDELA; which is a fancy name for what we used to call “Candle power.”
A real black body surface happens to have a very specific Radiation angular distribution pattern for its INTENSITY given by:-
I(theta) = I0.Cos(theta), and if we remember that the source does have an actual area (A0), then when viewed at an angle theta it has a projected area A(theta) = A0.Cos(theta).
So the RADIANCE of our (black body) source seen at an angle theta off the normal , is given by :-
I0.Cos (theta) / A0.Cos(theta) = I0 / A0 which we can see is constant in all directions.
Such a source which has constant Radiance when viewed in any direction, is referred to as a LAMBERTIAN source; and we say that its Intensity follows LAMBERT’S Cosine rule; I(theta) = I0.Cos(theta).
For a Lambertian source such as a real black body emitter, having an axial Intensity of I0, the Total Power radiated in all directions (2pi steradians) turns out to be simply pi.I0.
The disciplines of radiometry and photometry ought really be referred to as the undisciplines; as it is about the most screwed up area of physics; particularly when it comes to Photometry.
The quantities “Radiance” and “Luminance” are quite often very loosely referred to as the “Brightness” of the source; and there is the rub. “Brightness” has a very generic colloquial meaning to ordinary people in ordinary usage; but if being used loosely in photometry, and perish the thought ultra loosely in radiometry, it has a very specific meaning; namely the Luminance or Radiance; and no other quantity; which is why its usage is greatly to be discouraged.
And Wiki as we have seen stumbled right away on that one.
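One claim in this comment is worth verifying numerically: a Lambertian source with axial intensity I0, whose intensity falls off as I0·cos(theta), radiates a total power of pi·I0 into the full hemisphere. A midpoint-rule integration confirms it:

```python
import math

# Total power = integral of I0*cos(theta) * sin(theta) dtheta dphi
# over phi in [0, 2*pi] and theta in [0, pi/2]; analytically this is pi*I0.
I0 = 1.0
N = 100_000
dtheta = (math.pi / 2) / N
total = 0.0
for i in range(N):
    theta = (i + 0.5) * dtheta                  # midpoint rule
    total += I0 * math.cos(theta) * math.sin(theta) * dtheta
total *= 2 * math.pi                            # the trivial phi integral

print(round(total / math.pi, 6))  # -> 1.0, i.e. total power = pi * I0
```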

Girma
April 27, 2010 7:09 pm

To all who criticised me for fitting a curve to data, please look at the following sinusoidal pattern of the observed GMTA after it is detrended (trend removed):
Oscillating Anomaly
We cannot ignore what you see!

April 27, 2010 9:57 pm

Girma says:
April 27, 2010 at 7:09 pm
To all who criticised me for fitting a curve to data, please look at the following sinusoidal pattern of the observed GMTA after it is detrended (trend removed):
Oscillating Anomaly
We cannot ignore what you see!

Actually you should, because one apparent cycle is inadequate information on which to base a model. You certainly can’t use it for extrapolation as it has no physical underpinning. You need several more cycles before you have any evidence of sinusoidal behavior. Why not fit a quartic to it, that should fit just as well?

Girma
April 27, 2010 11:36 pm

Phil; April 27, 2010 at 9:57 pm
You wrote: You need several more cycles before you have any evidence of sinusoidal behavior. Why not fit a quartic to it, that should fit just as well?
We only have two cycles of data, and a quartic function does not have a period and an amplitude.
If you have two full cycles, is assuming a third one that implausible?

phlogiston
April 28, 2010 12:57 am

Leif Svalgaard
Steve Goddard
April 27, 2010 at 2:42 pm
” phlogiston says:
April 27, 2010 at 2:28 pm
So the sun has increased its radiation output by about 25 % over the last 3 billion odd years. So physical law dictates that the earth got much hotter over that time.
Only it hasnt. Why?
You know the answer to that one: “much less CO2 now than 3 billion years ago”, right?”
This is the standard AGW response to the faint sun paradox. But the CO2 decline over earth history has not been smooth; it has been much more irregular than the steady change in solar output. (You yourself have previously rubbished posters for looking for causative interaction between one smoothly varying parameter and another widely fluctuating one.) Relative to the earth’s age the Cretaceous is very recent; solar output would only be 1% or so less than now – but sky-high CO2 and no catastrophe.
Then there’s the tricky chicken-and-egg question: does CO2 cause or respond to temperature change? AGWers point to the Stefan-Boltzmann law to argue for simple causation and are intolerant of the slightest blasphemy against it, but go to ludicrous contortions to try to wriggle out of Henry’s law (and the associated van ’t Hoff equations), which dictates that CO2 will outgas from the oceans in response to increasing water temperature. So the SB law argues for active CO2 and Henry’s law for passive. You can’t pick and choose your laws; both are correct.
You are also fond of criticizing people (e.g. Stephen Wilde) for not proposing mechanisms for a proposed scenario. By what mechanism has CO2 adjusted relative to solar output to provide a stable climate suitable for life for about 4 billion years? Are you proposing we elevate Lovelock’s Gaia (daisyworld) hypothesis to the level of the SB and Henry’s laws?

Chris Wright
April 28, 2010 2:24 am

stevengoddard says:
April 27, 2010 at 5:38 am
Chris Wright (05:01:04) :
“Interesting to hear people here complaining about plots of raw data.
I keep wishing that NCDC, USHCN and GISS would make plots of raw data readily available.”
That’s a pretty strange comment. Are you saying it’s wrong to plot filtered data? True, sometimes filters may be used to mislead people, but in general applying a 12 month rolling average is perfectly standard and acceptable. And I showed both the filtered and non-filtered versions.
I think you’re confusing ‘adjustments’ to the raw data with filters used to clean up the noise in the graph. The original data was in no way ‘adjusted’ in my plots, I simply used the filter to reduce the noise, which is a perfectly normal procedure. And when the noise is reduced you can see that the result is quite non-linear, with temperature and CO2 moving in different directions for a significant part of the plot.
Of course, adjustments to the actual original data are a huge issue in climate science. For example, it seems that much of the 20th century warming comes from adjustments rather than the raw data. I certainly agree that all the raw data should be published and made publicly available.
Chris

April 28, 2010 5:39 am

phlogiston says:
April 28, 2010 at 12:57 am
But the CO2 decline over earth history has not quite been linear, much more irregular than the steady change in solar output.
The temperature has also varied very irregularly, sometimes being 10C warmer and 10C colder than today. The distribution of land/sea has varied a lot, so ocean currents have varied. But there are enough negative feedbacks in the climate system to ensure that there are no catastrophes. On the other hand, in about a billion years, when the Sun’s luminosity is 10% greater than today, temperatures will rise. Due to increased weathering of rocks the CO2 content will continue to decrease, and plants will die [when CO2 falls below some 150 ppm]. The oceans will be lost and only microbes will be left alive, until eventually the Sun gets too hot and all life will go extinct. So, the climate will not remain stable.

April 28, 2010 6:17 am

phlogiston says:
April 28, 2010 at 12:57 am
By what mechanism has CO2 adjusted relative to solar output to provide a stable climate suitable for life for about 4 billion years?
You have this backwards. It is life that has adjusted to changing climate and CO2, see e.g. http://www.leif.org/EOS/bg-3-85-2006.pdf

Stephen Wilde
April 28, 2010 8:07 am

“On the other hand, in about a billion years when the Sun’s luminosity is 10% greater than today, temperatures will rise.” (Leif Svalgaard).
Why so ?
It’s up some 30% from a few billion years ago, but temperature is much the same, as per the so-called faint sun paradox.

April 28, 2010 8:37 am

Stephen Wilde says:
April 28, 2010 at 8:07 am
It’s up some 30% from a few billion years ago, but temperature is much the same, as per the so-called faint sun paradox.
http://www.leif.org/EOS/bg-3-85-2006.pdf explains why. Figure 2 shows that the temperature was not the same. T has decreased from near boiling 3.5 Gyr ago and will again increase to that point in another billion years. The popular myth [oh so many of those] is that the presence of life shows that T must have been nearly constant. This is false. Life has evolved and adapted to T. In the beginning microbial life liked high T, and when high T returns, life will again become microbial. We live now in a comfortable [for us] window of about 1 Gyr [we are in the middle] where our kind of [complex] life is possible.

phlogiston
April 28, 2010 9:40 am

Leif Svalgaard says:
April 28, 2010 at 5:39 am
“Due to increased weathering of rocks the CO2 content will continue to decrease, and plants will die [when CO2 falls below some 150 ppm]. The oceans will be lost and only microbes will be left alive, until eventually the Sun gets too hot and all life will go extinct.”
Still – no worries, eh?

April 28, 2010 9:40 am

Girma says:
April 27, 2010 at 11:36 pm
Phil; April 27, 2010 at 9:57 pm
You wrote: You need several more cycles before you have any evidence of sinusoidal behavior. Why not fit a quartic to it, that should fit just as well?
We only have two cycles of data, and a quartic function does not have a period and an amplitude.

Which is an assumption on your part, you’ve fitted a curve with an assumption that it should be sinusoidal and therefore must have a period and amplitude. However a quartic will fit your data just as well, that’s the problem with picking a random function and fitting it without an underlying physical mechanism. Your ‘model’ predicts that temperature will go down in future because you chose a function based on your assumption that the temperature will go down in the future. There’s no basis for that assumption.
If you have two full cycles, is an assumption for a third one that implausible?
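Both sides of this exchange can be illustrated on synthetic data (made-up, not the real GMTA): a quartic can shadow the broad shape of roughly two cycles of an oscillation in-sample, but as a polynomial it diverges as soon as it leaves the fitted interval, which is Phil’s point about extrapolating without a physical mechanism:

```python
import numpy as np

t = np.linspace(0.0, 2.0, 200)      # two full "cycles" of synthetic data
y = np.cos(2 * np.pi * t)           # stand-in for a detrended anomaly

p = np.polyfit(t, y, 4)             # quartic fit, as Phil suggests
in_err = float(np.max(np.abs(np.polyval(p, t) - y)))

t_next = np.linspace(2.0, 3.0, 100)  # one "cycle" beyond the data
out_err = float(np.max(np.abs(np.polyval(p, t_next) - np.cos(2 * np.pi * t_next))))

print(out_err > in_err)  # -> True: the polynomial blows up outside the data
```

Note that the sinusoid suffers the same criticism in a different form: its extrapolation is only as good as the assumption that the oscillation and its period persist.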

phlogiston
April 28, 2010 9:45 am

Leif Svalgaard says:
April 28, 2010 at 6:17 am
Very interesting (if somewhat depressing) potted history of life by Franck et al 2006, thanks for the link.
Looks like everything changed at the Cambrian. Up to then, CO2 and temperature go in the same direction – after the Cambrian they diverge (on the Gyr timescale). My quick take on this would be that, pre-Cambrian, with a reducing atmosphere and bare dry rocky land surface, CO2 could move temperatures. But in the multicellular life epoch post-Cambrian you have a more complex and dynamic hydrological cycle, more rain and clouds. With the increased complexity come more negative feedbacks also. CO2 is pushed to the side as a climate / temperature forcer.

April 28, 2010 10:09 am

phlogiston says:
April 28, 2010 at 9:45 am
With the increased complexity more negative feedbacks also. CO2 is pushed to the side as a climate / temperature forcer.
As the temperature eventually increases a lot, H2O becomes the greenhouse gas instead, combined with the death of plants [and their negative feedback], so we basically revert to microbial life again before it all ends.

April 28, 2010 10:22 am

Chris Wright,
Why filter the data? The unfiltered data is perfectly clear.
http://docs.google.com/View?id=ddw82wws_616c7qsc3gm

April 28, 2010 10:24 am

phlogiston
There was an ice age during the Ordovician with CO2 levels 10X present. That would indicate that something else is driving the large swings in temperature associated with ice ages.

April 28, 2010 10:41 am

stevengoddard says:
April 28, 2010 at 10:24 am
There was an ice age during the Ordovician with CO2 levels 10X present. That would indicate that something else is driving the large swings in temperature associated with ice ages.
Of course: solar insolation and land/sea distribution. Why bring this up as a problem?

April 28, 2010 11:29 am

Leif,
The vast majority of land is currently located at high latitudes, and CO2 levels are near Phanerozoic lows. Why aren’t we in an ice age?

Girma
April 28, 2010 11:50 am

Phil (April 28, 2010 at 9:40 am)
You wrote, “…Which is an assumption on your part, you’ve fitted a curve with an assumption that it should be sinusoidal and therefore must have a period and amplitude. However a quartic will fit your data just as well, that’s the problem with picking a random function and fitting it without an underlying physical mechanism.
Please show me your fit of a quartic to this oscillating anomaly?
OSCILLATING ANOMALY
Please do!

April 28, 2010 12:03 pm

stevengoddard says:
The vast majority of land is currently located at high latitudes, and CO2 levels are near Phanerozoic lows. Why aren’t we in an ice age?
But we are! The current ice age began several million years ago. An ice age consists of many separate glaciations with brief warmer interglacial periods. We are in such an interglacial right now [actually nearing the end of it], and temperatures are already dropping on the way into the next glaciation some 50,000 years from now.

phlogiston
April 28, 2010 1:41 pm

stevengoddard says:
April 28, 2010 at 10:24 am
phlogiston
“There was an ice age during the Ordovician with CO2 levels 10X present. That would indicate that something else is driving the large swings in temperature associated with ice ages.”
It would indeed. CO2 atmospheric levels were apparently 8-20 times the present levels during the Ordovician. And the era ended with a severe ice age with glaciers over the present day Sahara. This is the opposite of the run-away warming that C-AGW would predict. Another reason why the Ordovician is problematic for AGW (as if this were not enough) is corals. Current AGW theory holds that increasing CO2 atmospheric levels are acidifying the ocean and causing stress to corals. But in the Ordovician with much higher airborne CO2, corals did – again – the opposite of the AGW prediction – instead of going extinct, they evolved and spread widely and successfully.

phlogiston
April 28, 2010 2:10 pm

Leif Svalgaard says:
April 28, 2010 at 8:37 am
“This is false. Life has evolved and adapted to T. In the beginning microbial life liked high T, and when high T returns, life will again become microbial.”
Fig 2b is certainly a thought-provoking and useful summary of global temperatures over earth’s lifetime. Your above summary skates over the Cambrian. The sharp drop in temperatures at the Cambrian explosion could be a positive feedback of spreading plants changing the atmosphere (more O2, less CO2, more water vapour) until land and sea were “saturated” with plant cover. In this sense I believe Lovelock’s Gaia has some validity – this drop in temps at the Cambrian may well have been caused by the biosphere.
Fig 2b is artistically compelling; it looks like a sadly majestic sphinx sitting in the desert. His hindquarters represent the rise of single-celled life, his mane and ears the pinnacle of eukaryotic and multicellular life, his front paws the future decline and extinction of life. It reminds me of a poem I learned at prep school – probably as a punishment for some misdemeanour:
I met a traveller from an antique land
Who said “two vast and trunkless legs of stone
Stand in the desert. Near them on the sand
Half sunk, a shattered visage lies, whose frown
And wrinkled lip and sneer of cold command
Tell that his sculptor well these passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed.
And on the pedestal these words appear
My name is Ozymandias, king of kings,
Look on my works ye mighty and despair.
Nothing beside remains, round the decay
Of that colossal wreck, boundless and bare,
The lone and level sands stretch far away.”
Shelley.

April 28, 2010 2:27 pm

phlogiston says:
April 28, 2010 at 2:10 pm
Your above summary skates over the Cambrian. The sharp drop in temperatures at the Cambrian explosion could be a positive feedback of spreading plants
No plants yet. First primitive plants appear on land in the Ordovician. The authors do have a comment on the sharp change. No doubt there are details to be worked out, but the gross picture is compelling [to me at least].

pwl
April 28, 2010 4:04 pm

Girma, Anthony, Leif, other experts,
I’m in a conversation with a climate scientist who has this to say:
“[Girma’s] assertions about the connection (or lack thereof) between CO2 and temperature are using human CO2 emission rates, not total CO2 concentrations in the atmosphere. He should use the latter if he wants to make a quantitative assertion connecting or disproving a connection between the two.” – D.P.
What do you say in response?
Thanks in advance.

pwl
April 28, 2010 4:35 pm

I asked D.P. “Why?” to his suggestion in the above comment: “pwl, April 28, 2010 at 4:04 pm”
His reply was:
“As to the question of why human input is not as appropriate a variable to study here rather than total column CO2: The former is a driving force in changing the amount of CO2 in the atmosphere, but there are huge natural sources and sinks of CO2. From a radiative standpoint, the CO2 that is important is that which is in the atmosphere at a given time.” – D.P.
What say you all?

Girma
April 28, 2010 5:09 pm

pwl (April 28, 2010 at 4:04 pm)
It does matter whether you consider human emission of CO2 or concentration of CO2 in the atmosphere.
There was no change in the global mean temperature pattern in the last 130 years, while there was a large increase (emission or concentration) in CO2 since the 1940s. The global mean temperature pattern for the last 130 years was a single pattern of a combination of linear and sinusoidal functions, as shown in Figure 3 of this article. As a result, the effect of CO2 on global mean temperature is nil.

April 28, 2010 5:34 pm

Girma says:
April 28, 2010 at 5:09 pm
The global mean temperature pattern for the last 130 years was a single pattern of a combination of linear and sinusoidal functions, as shown in Figure 3 of this article. As a result, the effect of CO2 on global mean temperature is nil.
Except if the linear function is just the effect of CO2. Try to plot CO2 against your linear term and report here what the correlation coefficient is.

pwl
April 28, 2010 5:52 pm

Girma, thank you. As I suspected.

pwl
April 28, 2010 5:56 pm

Girma, do you mean “It DOES MATTER” or “It DOES NOT MATTER”?
Thanks.

Editor
April 28, 2010 6:00 pm

From above …
You wrote: You need several more cycles before you have any evidence of sinusoidal behavior. Why not fit a quartic to it, that should fit just as well?
We only have two cycles of data, and a quartic function does not have a period and an amplitude.

In reply, this was stated:
Which is an assumption on your part, you’ve fitted a curve with an assumption that it should be sinusoidal and therefore must have a period and amplitude. However a quartic will fit your data just as well, that’s the problem with picking a random function and fitting it without an underlying physical mechanism.
—…—…—
That is not correct: We have adequate temperature data going back some 4000+ years, with an accuracy varying from 1/2 to 1/4 degree for the whole period.
Thus, we know absolutely that your “assumed random function” is wrong: The only valid function describing temperature is:
(a) Mann’s hockey stick (declining constantly for 900 years, then a sudden sharp rise in 1950 with man’s CO2 contributions),
or
(b) a long sinusoid of 800-900 year periodicity with (as pointed out above) a steadily DECREASING magnitude, with this proposed 60-year short cycle adding to and subtracting from the long-term trend,
or
(c) this paper’s proposed long-term constant increase with a short 60-year cycle superimposed on the long-term rise.
Either of the last two is adequate to predict the next 30 to 60 years of temperature – and needs NO underlying physical explanation to serve as that predictor. As a “real theory” – no, of course not. But we are looking at a way to predict the next ten to thirty years based on the past 2000 years of cyclic temperatures. Permanent or final answer? No.
– But the Mann-GISS-IPCC hockey stick utterly fails in every prediction made since Hansen started in 1980. In fact, Hansen’s (and the liberal/socialist/environmental movements’) IPCC GCM predictions cannot even make their 1980-2000 “back-dated” predictions correct without (1) artificially subtracting (artificially assigned) variable “particles” and “soot” factors from each year’s predicted temperature and then (2) artificially assigning “adjustments” to each year’s “corrupted” (er, corrected) original temperature data.
Which, conveniently, is no longer available in raw data form.

Girma
April 28, 2010 6:24 pm

pwl (April 28, 2010 at 5:56 pm)
You wrote, Girma, do you mean “It DOES MATTER” or “It DOES NOT MATTER”?
It does not matter.
Both the emission and the concentration have increased drastically since the 1940s, but the pattern has not changed since 1880.

pwl
April 28, 2010 7:25 pm

Thanks Girma.

Girma
April 28, 2010 8:34 pm

Leif Svalgaard (April 28, 2010 at 5:34 pm)
You wrote, Try to plot CO2 against your linear term and report here what the correlation coefficient is.
The persistent global linear warming of 0.006 deg C per year is not related to the increase in the CO2 concentration because the concentration is a parabola as shown in the following chart.
Relationship between Global Mean Temperature & CO2

April 28, 2010 8:42 pm

Girma says:
April 28, 2010 at 8:34 pm
The persistent global linear warming of 0.006 deg C per year is not related to the increase in the CO2 concentration because the concentration is a parabola as shown in the following chart.
I should have been more precise. It is said that the warming goes with the logarithm of CO2. So plot against log(CO2). BTW, the concentration cannot be a parabola, because then it would increase as you went further back in time…

Girma
April 28, 2010 9:16 pm

Leif Svalgaard (April 28, 2010 at 8:42 pm)
As shown in the following chart,
Relationship between CO2 and GMTA
The correlation between CO2 and GMTA from 1970 to 2000 appears to be close. However, we now know that the GMTA has a cyclic component as shown in Figure 3 of this article, so this cyclic component is not related to CO2 concentration.
What is left is whether the 0.06 deg C per decade of persistent linear warming is caused by CO2. As this value is much less than the IPCC projection of 0.2 deg C per decade, it cannot be catastrophic. Besides, as this linear warming existed from 1880 to 1940 when CO2 concentration was low, it may not be caused by CO2 concentration.

April 28, 2010 9:50 pm

Girma says:
April 28, 2010 at 9:16 pm
However, we now know that the GMTA has a cyclic component as shown in Figure 3 of this article, so this cyclic component is not related to CO2 concentration.
The cyclic component is much smaller than the linear component…

April 28, 2010 10:05 pm

Girma says:
April 28, 2010 at 9:16 pm
However, we now know that the GMTA has a cyclic component as shown in Figure 3 of this article, so this cyclic component is not related to CO2 concentration.
If we simply plot log2(CO2) and Temperature anomalies on the same chart normalized to matching scales, we get a very good correlation:
http://www.leif.org/research/Temp-and-CO2-since-1850.png
This is usually the single [and strongest] argument needed for AGW. What say you?

Girma
April 28, 2010 10:27 pm

Global Warming and Cooling Rates and why IPCC went wrong:
As the cyclic component of the GMTA changes by 0.6 deg C in 30 years, its warming or cooling rate is (0.6/30)*10 = 0.2 deg C per decade. The linear component of the GMTA has a warming rate of 0.06 deg C per decade. As a result, during the warming phase, the average warming rate is 0.2 + 0.06 = 0.26 deg C per decade, which is equal to (0.77/30)*10 = 0.26; during the cooling phase, the average cooling rate is 0.2 – 0.06 = 0.14 deg C per decade, which is equal to (0.42/30)*10 = 0.14.
The problem with the IPCC interpretation of GMTA is that it completely ignored the cyclic component and assumed the current warming rate of GMTA is a long-term trend, as shown in Figure 5.
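For readers who want to follow the arithmetic, here is a minimal sketch that checks the rates quoted in the comment above, using only the figures stated there (0.6 deg C cyclic swing per 30-year phase, 0.06 deg C/decade linear rate):

```python
# Checking the rates quoted above (figures taken from the comment, not re-derived).
cyclic_swing = 0.6   # deg C change of the cyclic component over one 30-yr phase
phase_years = 30
linear_rate = 0.06   # deg C per decade, persistent linear component

cyclic_rate = cyclic_swing / phase_years * 10  # convert to deg C per decade
warming_rate = cyclic_rate + linear_rate       # phases where the cycle rises
cooling_rate = cyclic_rate - linear_rate       # phases where the cycle falls

print(round(cyclic_rate, 2), round(warming_rate, 2), round(cooling_rate, 2))
# → 0.2 0.26 0.14
```

The 0.26 and 0.14 values match the (0.77/30)*10 and (0.42/30)*10 figures quoted in the comment.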

April 28, 2010 10:36 pm

Girma says:
April 28, 2010 at 10:27 pm
something non-responsive
Let me try again:
If we simply plot log2(CO2) and Temperature anomalies on the same chart normalized to matching scales, we get a very good correlation:
http://www.leif.org/research/Temp-and-CO2-since-1850.png
What say you? In response to my plot and the good correlation.

Girma
April 28, 2010 10:44 pm

Leif Svalgaard (April 28, 2010 at 10:05 pm)
GMTA = Linear Anomaly + Cyclic Anomaly
The Cyclic Anomaly has a mean value of zero, so it is not related to CO2 concentration.
As a result, when comparing ln(CO2) with GMTA, the cyclic component must be removed from the GMTA.
If you give me the data for CO2, I will try to draw this graph.

April 28, 2010 11:00 pm

Girma says:
April 28, 2010 at 10:44 pm
As a result, when comparing ln(CO2) with GMTA, the cyclic component must be removed from the GMTA.
No, since it is smaller [as my graph shows] than the cumulative linear trend, the cyclic component doesn’t matter. You can leave it in as I did or take it out; it does not damage the good correlation between dT and log2(CO2). Might even improve it…

April 28, 2010 11:10 pm

Girma says:
April 28, 2010 at 10:44 pm
If you give me the data for CO2, I will try to draw this graph.
Since I have already drawn it, you hardly need to 🙂
Here is how the CO2 data was generated [and some analysis related to the data]:
http://www.leif.org/research/Temp-and-CO2.pdf

April 28, 2010 11:25 pm

Leif Svalgaard says:
April 28, 2010 at 11:10 pm
The URL referred to lacks an ‘l’ at the end; should be:
http://cdiac.ornl.gov/trends/emis/tre_glob.html

April 28, 2010 11:46 pm

It makes very little difference if you use CO2 or log CO2 in this region of the curve.
Here is why – CO2 vs ln(CO2)
http://docs.google.com/View?id=ddw82wws_619c8qs9kfh
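The point in this comment can be checked with a short sketch: over a 20th-century-like span, ln(CO2) barely deviates from a straight line in CO2. The 290–390 ppm range below is an assumed, illustrative span, not measured data:

```python
import math

# How far does ln(CO2) stray from a straight line over a 20th-century-like range?
# (The 290-390 ppm span is assumed for illustration, not measured data.)
co2 = list(range(290, 391, 10))            # ppm
logs = [math.log(c) for c in co2]

# straight line through the two endpoints
a = (logs[-1] - logs[0]) / (co2[-1] - co2[0])
b = logs[0] - a * co2[0]

max_dev = max(abs(l - (a * c + b)) for c, l in zip(co2, logs))
rel = max_dev / (logs[-1] - logs[0])       # deviation as a share of the total change
print(round(rel, 3))  # → 0.037, i.e. under 4%: log and linear are nearly parallel here
```

Over a doubling or quadrupling of CO2 the two would diverge strongly, which is Leif’s counterpoint in the next comment.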

pwl
April 29, 2010 12:19 am

Girma,
“This unique characterization of the temperature and CO2 data does not hold over all datasets and can only be made if intrinsic variability and uncertainty in the quantities is largely ignored.” – D.P.
I think he’s saying that the CRU isn’t enough…?
What are your thoughts?

April 29, 2010 1:07 am

stevengoddard says:
April 28, 2010 at 11:46 pm
It makes very little difference if you use CO2 or log CO2 in this region of the curve.
Might as well do it right anyway. Perhaps make it easier to see where we might end up come a doubling or 4X CO2.
The whole exercise is just curve fitting. My point being that the long-term trend is quite accurately expressed as a function of log(CO2) as the AGW crowd wants, and that therefore the exercise cannot be used as a counterargument.

pwl
April 29, 2010 1:08 am

“I understand the few comments regarding physics, but honestly, if $70Billion isn’t enough to even remotely understand the physics, isn’t showing that natural factors dominate the correct first step?” – Michael D Smith, April 26, 2010 at 4:52 am
$70 Billion? They’ve really spent that much? Even if it’s only 1/10th or 1/70th of that, it’s obscene as:
WOW! EPIC FAIL! Along come a few people who falsify their entire billion-dollar hypothesis industry? No wonder they are pissed. The gold gravy train is about to have a major cooling-down trend as the numerous falsifications of the alleged AGW hypotheses stand up to their scalding lava melt tests and garner attention in the process.

Girma
April 29, 2010 3:56 am

What really counts is whether the GMTA data has a cyclical component or not.
Once the existence of a cyclic component in the GMTA is accepted, then the warming due to this cyclic component of 0.2 deg C per decade is not permanent, there is no catastrophic global warming, and the effect of CO2 on GMTA is insignificant.
Once the cyclical warming is removed, what is left is a warming of 0.6 deg C per century, which fortunately will be cancelled out by the end of this century, because this century started when the oscillation anomaly was at its maximum and will end near its minimum, at 0.41 deg C for 2090 (0.48 – 0.42 + 0.77 – 0.42 = 0.41, where 0.48 deg C is the GMTA for 2000, –0.42 deg C is for each of the two cooling phases, and 0.77 deg C is for the one warming phase).
Let us live our life without fear of climate catastrophe. Climate Scientists, please release the world from this fear, as it is completely baseless.
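A quick check of the century-end arithmetic in Girma’s comment, with the values exactly as he states them:

```python
# The century-end arithmetic, with the values exactly as stated in the comment.
gmta_2000 = 0.48       # deg C anomaly at 2000
cooling_phase = -0.42  # net change over each 30-yr cooling phase
warming_phase = 0.77   # net change over the one 30-yr warming phase

gmta_2090 = gmta_2000 + cooling_phase + warming_phase + cooling_phase
print(round(gmta_2090, 2))  # → 0.41
```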

April 29, 2010 7:21 am

Leif,
It is more accurate to use log(CO2) but I was making the point that those counting on the logarithmic function to significantly reduce future warming, are going to be disappointed. We are far past the knee of the T/CO2 curve already.

April 29, 2010 8:20 am

Girma says:
April 29, 2010 at 3:56 am
Once the existence of a cyclic component in the GMTA is accepted, then the warming due to this cyclic component of 0.2 deg per decade is not permanent and there is no catastrophic global warming, and the effect of CO2 on GMTA is insignificant.
No, you cannot deduce that. If the linear trend is permanent [at least on a timescale of a few hundred years, and there are many signs that it may be], then it will eventually completely swamp the insignificant cyclic variation. As you say yourself, the cyclic component is not permanent, and hence we can ignore it, but that does not change the much larger linear trend.
stevengoddard says:
April 29, 2010 at 7:21 am
It is more accurate to use log(CO2) but I was making the point that those counting on the logarithmic function to significantly reduce future warming, are going to be disappointed. We are far past the knee of the T/CO2 curve already.
We don’t know that. One might hope, but ‘hope’ is not knowing.

pwl
April 29, 2010 8:55 am

The conversation I’m having on another channel with an atmospheric climate scientist, regarding his alleged AGW hypothesis (which he has yet to define) and specifically about Girma’s above article, is taking an interesting turn. I find some aspects of his recent response interesting.
“I … [suggest] you look at these:
http://data.giss.nasa.gov/gistemp/graphs/
The global ground and global land-ocean graphs show the same increased slope from around 1960 on, which is what we expect with the increases in CO2 and is apparently not seen by Girma. Following those are graphs of global temperature in different latitude bands, with the northern latitude band showing a greater warming.
I’ve mentioned the importance of time and scale constants before. We see regular temperature differences over times shorter than we are interested in. These include regular day/night changes, seasonal changes, El Niño events of a few years, etc. We see regional differences that are balanced by other parts of the world, but sampling of this data is not uniform. We had a winter some 5 degrees cooler than average this last year despite the fact that globally it was one of the warmest on record (Dec 09 was 8th warmest on record, Jan 10 was 4th warmest).
Imagine you want to measure the height of a choppy lake. If you have accurate measurements over every square inch of the surface, you can apply rigorous mathematics to get an accurate measure of the surface. In practice, you can’t measure at all locations at all times, so you have an intrinsically uncertain measurement of what you want: the surface of the lake. Similarly for global temperature.” – D.P.

Girma
April 29, 2010 8:56 am

Leif Svalgaard (April 29, 2010 at 8:20 am)
You wrote, No, you cannot deduce that. If the linear trend is permanent [at least on a timescale of a few hundred years, and there are many signs that it may be], then it will eventually completely swamp the insignificant cyclic variation. As you say yourself, the cyclic component is not permanent, and hence we can ignore it, but that does not change the much larger linear trend.
The linear trend is only 0.6 deg C in a century. Nothing to be scared about. By then man may have started to use fusion energy or something else.

April 29, 2010 9:08 am

Girma says:
April 29, 2010 at 8:56 am
The linear trend is only 0.6 deg C in a century. Nothing to be scared about.
Nobody here is scared. The issue is whether that linear trend will hold. The AGW crowd claims it will not, but will speed up. Your ‘analysis’ does not show they are wrong. It is just curve fitting to current data, and has no predictive power.

toby
April 29, 2010 11:29 am

I have mailed Girma with my own analysis of the data, which shows there are other models which are better fits, and which do not include the cyclic component.
A simple time series model (ARIMA(1,1,1)) is shown to be better in all the accepted diagnostics. In fact, a model with cyclic component coefficient = 0.135 is a better fit than Girma’s model, where his coefficient = 0.3.
Girma is correct when he fitted a linear model and noted the cyclical residuals (indicating residual structure in the data). However, fitting a cosine function is not necessarily the best way to go after that. The models with a cyclical component still show residual structure in the data when the predictions are subtracted from the observations. The ARIMA model does not.
If there are a priori reasons for the cyclical component, then it may have some justification. Otherwise, it should be dropped and an alternative model sought.
There is always danger in predicting results outside the region for which the data was fitted. The ARIMA(1,1,1) shows a static prediction with expanding error bars, i.e. the anomaly stays within that range if CO2 does not continue to increase.
PS. ARIMA(1,1,1) model is (with y(i) as the ith observation, and y(i)hat as the model estimate of the ith observation):
y(i)hat=y(i-1)+(0.3765)*[y(i-1)-y(i-2)]+(-0.724)*[y(i-1)hat-y(i-1)]
y(1)hat=0
y(2)hat=y(1), and the formula takes over from then on.
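toby’s recursion can be transcribed directly into code. A sketch, with the coefficients as stated in his comment; the anomaly series below is made up for illustration and is not the CRU data:

```python
# toby's ARIMA(1,1,1) recursion, transcribed as given; the anomaly series below
# is made up for illustration and is not the CRU data.
PHI, THETA = 0.3765, -0.724

def arima_111_fits(y):
    """One-step-ahead fitted values y(i)hat for the recursion above."""
    yhat = [0.0, y[0]]  # y(1)hat = 0, y(2)hat = y(1)
    for i in range(2, len(y)):
        yhat.append(y[i-1]
                    + PHI * (y[i-1] - y[i-2])        # AR term on the differences
                    + THETA * (yhat[i-1] - y[i-1]))  # MA term on the last error
    return yhat

anoms = [-0.30, -0.25, -0.28, -0.20, -0.15]  # hypothetical anomalies
print([round(v, 3) for v in arima_111_fits(anoms)])
# → [0.0, -0.3, -0.195, -0.353, -0.059]
```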

R. de Haan
April 29, 2010 12:47 pm

Just for the record, Joe Bastardi on the same subject today!
http://www.accuweather.com/video.asp?channel=vblog_bastardi

Girma
April 29, 2010 1:23 pm

Leif Svalgaard (April 29, 2010 at 9:08 am)
You wrote, Nobody here is scared. The issue is whether that linear trend will hold. The AGW crowd claims it will not, but will speed up. You ‘analysis’ does not show they are wrong. It is just curve fitting to current data, and has no predictive power.
Of course it will not hold. The question is how many years are required to see a change in the value of the linear warming anomaly. On a longer time scale, the linear warming anomaly of the GMTA is a curve. Otherwise, we would not have had either the MWP or the LIA. Probably, the radius of curvature of the GMTA curve is so large that it appears as a straight line when considering two points on the curve only 130 years apart (1880 to 2010). As a result, it is hard to accept that the linear warming rate of 0.6 deg C per century that was constant for 130 years will change suddenly in the next couple of decades.
As for your “It is just curve fitting to current data, and has no predictive power”, I will put my money where my mouth is and bet US$1000 that my prediction for the GMTA trend for the next ten years is closer to the truth than the IPCC’s, where both predictions are shown in Figure 3.

Girma
April 29, 2010 2:20 pm

Conspiracy theory:
Look at Figure 3. Is it possible that, in order to exaggerate the global warming after 1970, they made the temperature trend before 1970 appear flat? They may have done this by modifying the data before 1970, increasing the lower temperatures in the 1890s and decreasing the higher temperatures in the 1950s.

toby
April 29, 2010 3:01 pm

Just like to point out that Figure 1 above seems to differ from Open Mind and the paper of Rahmstorf, which seem to give a different version of where the current temperature is in relation to IPCC models.
http://tamino.wordpress.com/2008/03/26/recent-climate-observations-compared-to-ipcc-projections/
http://pubs.giss.nasa.gov/docs/2007/2007_Rahmstorf_etal.pdf
I will leave it for others to point out which is correct.

Girma
April 29, 2010 5:07 pm

toby (April 29, 2010 at 3:01 pm)
Regarding this issue, I give Open Mind and RealClimate a gold medal for their superb obfuscation.
RealClimate never allowed me to make a single post!
The mistake (or something worse) is now discovered: the global mean temperature has a cyclic component as shown in Figure 3. With this, AGW does not have any leg to stand on.

April 29, 2010 6:19 pm

Girma says:
April 29, 2010 at 5:07 pm
The mistake (or something worse) is now discovered: the global mean temperature has a cyclic component as shown in Figure 3. With this, AGW does not have any leg to stand on.
AGW doesn’t care about your cyclic component [which, by the way, is too large: it should be more like 0.2 than 0.3 deg C, as you should not draw to the extreme values (those are noise on top of the component) but draw the curve such that about half the data points are above and the other half below]. Since a cycle bobs up and down, its effect is nil, and any trend is thus not due to any short-term cycle.

toby
April 30, 2010 12:18 am

Girma,
If you read the material I sent you, I hope (but maybe do not expect) that you will agree
– the cyclic component of your model has no justification mathematically in that, while it has an improved fit over a linear model, it is not the best fitting model to the data.
– Improving the cyclic component fit does not help because (a) there is residual structure in the data, meaning that the model is not optimum, and (b) it fits worse for recent years than your model with 0.3 instead of 0.135.
– There has to be a problem with a model whose plot does not lie within the data points, but outside them. See Figure 3.
Since the cyclic component has no mathematical justification, then where does it come from?
I am not sure why you mention RealClimate – I did not. The Open Mind blog, while not of your persuasion, is still the best on the web for discussion of the statistical issues involved in assessing global warming data.
A good test of any model is cross-validation – remove 50% of the points at random, and fit the “best” model. Then assess the model fit to the removed 50%. Try it and see how your model holds up.
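toby’s cross-validation suggestion can be sketched in a few lines. Here a straight line stands in for “the best model” and synthetic data stands in for the CRU series (both are assumptions for illustration only; the 0.006 deg C/yr trend echoes the linear term discussed in the thread):

```python
import random

# A minimal version of the cross-validation test suggested above, with a straight
# line standing in for "the best model" and synthetic data in place of the CRU
# series (0.006 deg C/yr trend plus noise, echoing the linear term in the thread).
random.seed(42)
data = [(t, 0.006 * (t - 1880) - 0.3 + random.gauss(0, 0.1))
        for t in range(1880, 2010)]

random.shuffle(data)                      # remove 50% of the points at random
half = len(data) // 2
train, held_out = data[:half], data[half:]

# ordinary least squares on the training half
n = len(train)
mx = sum(t for t, _ in train) / n
my = sum(y for _, y in train) / n
slope = (sum((t - mx) * (y - my) for t, y in train)
         / sum((t - mx) ** 2 for t, _ in train))
intercept = my - slope * mx

# assess the fit on the held-out half
mse = sum((y - (slope * t + intercept)) ** 2 for t, y in held_out) / len(held_out)
print(round(slope, 4), round(mse, 3))  # recovered slope should sit near 0.006
```

The same split-fit-score loop works for any of the candidate models in this thread; the one with the lowest out-of-sample error wins.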

Girma
April 30, 2010 5:13 am

toby (April 30, 2010 at 12:18 am)
Totally disagree.
I believe what I see. No obfuscation with extra maths is required. The GMTA has a cyclic component (see Figure 3).

toby
April 30, 2010 7:49 am

Girma,
Ok, we have to agree to differ on the efficacy of the different models. I think the time-series model is justified by the accepted statistical modeling procedures (not “extra mathematics”) and is demonstrated to be superior by conventional model diagnostics.
So, statistics is agnostic on the cyclic component. In my view, it will have to depend on physics for its justification.
One cross-validation method would be to apply the models to the GISS data and see which fits best. Have you considered doing that?

Girma
April 30, 2010 9:24 am

toby (April 30, 2010 at 7:49 am)
Sorry, I don’t trust GISS’s data (see GISS modification of data).
You wrote, In my view, it will have to depend on physics for its justification.
Is correlation of the GMTA cycle with positive and negative phases of the PDO not adequate?
PDO Phases

toby
April 30, 2010 2:36 pm

Hi, Girma,
I looked at the page you sent me to. Certainly, I think using the PDO as a justification is better than hanging a cyclical component on statistical considerations alone.
However, I have reservations. It is true the PDO data does show traces of a 60 year cycle, but too much can be read into that. There are other cycles at work also.
The PDO data displayed at the link is charted in running averages of 60 months, so it emphasizes the large-scale structure. If you display it in yearly averages, you see a more complex fine-grained structure. There are shorter oscillations superposed on the longer warm-cool-warm oscillations. For example, 1957 and 1958 were relatively warm years, despite being in a cool long-term oscillation.
To cut a long story short, an ARIMA(1,0,0)(1,0,0)[5] model fitted the data very well. This means an autoregressive component and a 5-year seasonal autoregressive component.
The formula is (data at http://jisao.washington.edu/pdo/ ):
y(i)hat = 0.172 + 0.503*y(i-1) + 0.1633*[y(i-5) - 0.503*y(i-6)]
An ARIMA(1,0,0)(1,0,0)[60] model, i.e. with a periodicity of 60 years, also fitted well, but so did one with a periodicity of 30 (in fact, even better). The 5-year periodicity had the best fit of all. Clearly, there is a lot of noise in the PDO data, and attaching a long cycle to it without taking account of the short-cycle effects is not the best approach. Shorter-cycle effects frequently disrupt the longer-cycle effects.
The 5-year periodicity is interesting as I found previously a 5-year periodicity in the GISS data, which I thought was due to ENSO. The physical interpretations of these results are:
The autoregressive component comes from the tendency of a warm or cool year to “regress to the mean” and be cooler or warmer than the previous year. Similarly for the periodic autoregressive component.
In the CRU data, the “first-order difference” in the ARIMA model removes the non-stationarity (trend) in the mean, while the moving-average component accounts for random shocks like volcanic eruptions, which we know can affect world temperature.
I am not saying these time-series models are ideal, but they do account for the data structure without appealing solely to a linear trend & 60-year cycle, which does exist but seems to be swamped by other effects.
So what does the ARIMA model predict for world temperature? Very unspectacularly, it predicts a fairly static anomaly for the next 5 years, with a predicted value of 0.41 (2009 CRU value = 0.43) and a 95% confidence interval of [0.14, 0.68] in 2015. However, that assumes 20th century conditions (no further CO2 added to the atmosphere).
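toby’s seasonal PDO recursion, transcribed as given in his comment; the index values below are placeholders, not the JISAO series:

```python
# toby's ARIMA(1,0,0)(1,0,0)[5] recursion for the PDO, transcribed as given;
# the index values below are placeholders, not the JISAO series.
def pdo_fit(y, i):
    """One-step-ahead fit for observation i (requires i >= 6)."""
    return 0.172 + 0.503 * y[i-1] + 0.1633 * (y[i-5] - 0.503 * y[i-6])

pdo = [0.5, -0.2, 0.1, 0.8, -0.4, 0.3, -0.1, 0.6]  # hypothetical yearly values
print(round(pdo_fit(pdo, 7), 3))  # → 0.154
```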

Girma
April 30, 2010 4:01 pm

Thanks toby
We have to wait and see who is right. No one will fail to see the increasingly freezing and snow covered winters in the coming couple of decades. I dearly hope the other camp does not say it is because of global warming.
As far as I am concerned, the effect of CO2 on GMTA is zilch, nil, naught.
I repeat:
Figure 4, with a high correlation coefficient of 0.88, shows the important result that the observed GMTA can be modeled by a combination of a linear and a sinusoidal pattern given by Equation 3. This single GMTA pattern that was valid in the period from 1880 to 1940 was also valid in the period from 1940 to 2000, after about a 5-fold increase in human emission of CO2. As a result, the effect of human emission of CO2 on GMTA is nil. Also, IPCC’s conclusion of “accelerated warming” shown in Figure 5 is incorrect.
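For readers wanting to experiment, the “linear plus sinusoid” form under discussion can be sketched as below. The coefficients are assembled from numbers quoted in this thread (0.006 deg C/yr trend, ~60-year period, ~0.3 deg C amplitude, phase peaking near 1880/1940/2000, base level arbitrary); this is an illustrative reconstruction, not Equation 3 of the article itself:

```python
import math

# The "linear plus sinusoid" form under discussion, with coefficients assembled
# from numbers quoted in this thread: 0.006 deg C/yr linear trend, ~60-yr period,
# ~0.3 deg C amplitude, phase peaking near 1880/1940/2000, base level arbitrary.
# This is an illustrative reconstruction, NOT Equation 3 of the article.
def gmta_model(year, a=0.006, amp=0.3, period=60.0, t0=1880.0, base=-0.52):
    return base + a * (year - t0) + amp * math.cos(2 * math.pi * (year - t0) / period)

# Two peaks 60 years apart differ only by the accumulated linear term:
print(round(gmta_model(2000) - gmta_model(1940), 2))  # → 0.36 = 0.006 * 60
```

With this form, the cyclic term contributes nothing to the 60-year-apart differences, which is the crux of both Girma’s claim and Leif’s counterargument that the linear term could itself be the CO2 signal.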

phlogiston
April 30, 2010 4:10 pm

Leif Svalgaard says:
April 29, 2010 at 6:19 pm
“Girma says:
April 29, 2010 at 5:07 pm
The mistake (or something worse) is now discovered: the global mean temperature has a cyclic component as shown in Figure 3. With this, AGW does not have any leg to stand on.
AGW doesn’t care about your cyclic component [which, by the way, is too large: it should be more like 0.2 than 0.3 deg C, as you should not draw to the extreme values (those are noise on top of the component) but draw the curve such that about half the data points are above and the other half below]. Since a cycle bobs up and down, its effect is nil, and any trend is thus not due to any short-term cycle.”
But are you agreeing with this AGW position or not? – this is not clear from your Mona Lisa-like detachment. This issue touches on a major contradiction and inconsistency in the AGW position. Multidecadal oscillation in global mean temp, particularly the 1970-2005 half-cycle rise, is eagerly trousered by AGWers in an unending stream of studies showing a multitude of climate indicators and biomarkers having increased from 1970 to the end of the century. This alone is regarded as proof of AGW.
No, AGW does very much care about these cycles, most of all the 1970-2005 half-cycle on which much of their supporting data is based.
But when a longer view is taken and the oscillations become inconvenient, such as the current inflection toward cooling or earlier warming-cooling episodes, suddenly they come over all statistical and insist all such oscillations must be ironed out, and “underlying trends” looked for. (These underlying trends are in turn also likely to be longer term oscillations, e.g. recovery from the LIA, but who cares as long as they appear to support the party line?)
“Since a cycle bobs up and down, its effect is nil..” Fine. Then you will agree with me that the significance of all the thousands of studies showing “global warming indicators” increasing over the last 2-3 decades is indeed “nil”: the melt of Arctic ice, biological cycles and phenology (times of flowering or hatching), and so on.
Admittedly it is more the press and politicians (and less scrupulous scientists) than the geophysical community who are surfing the 1970-2005 wave to promote AGW, and they are about to wipe out in some style.
BTW how do you do italics in WUWT?
[REPLY – i at the start and /i to close. Both to be surrounded by lesser-than and greater-than signs. ~ Evan]

phlogiston
April 30, 2010 4:25 pm

toby
When I look at sea surface temperatures globally, or the troposphere above the ocean, from HadCRUT, UAH, and NCDC (multi-year smoothed), I see a series of 7-8 year jumps: 197?-1985; 1985-1993; 1993-2001; 2001-2008. Is this just me? If these jumps are a real pattern, then the el Nino-related rise in 2009-2010 and the current levelling off are expected. The current “jump” should come down to baseline around 2016. What seems to matter is whether the jumps end warmer or cooler than they started. Bob Tisdale talks about the series of ENSO el Nino-La Nina cycles, each one ending with a positive or negative heat budget.

toby
May 2, 2010 12:55 am

Girma,
Stephen Jay Gould, in one of his essays, recalled a mid-20th century Professor of Geology who was a vocal and vociferous opponent of theories about continental drift. Then, when the evidence finally became incontrovertible and the paradigm shifted, the Professor “cheerfully re-did his life’s work”. I think a few people will be re-doing a lifetime’s work in a few years, cheerfully or not.
Phlogiston,
I have not had the time to look at that data. However, while eyeballing charts to see patterns is a good way to explore data for possible models, the models have to be confirmed by statistical analysis. The human mind is notorious for seeing patterns that do not exist; the classic example is the constellations and astrology.

toby
May 2, 2010 3:15 am

Phlogiston,
I looked at the plots of the troposphere on Wood for Trees, and there are apparent cycles. A model that included a cyclic component of 3 to 5 years would possibly fit. But the increasing trend is also striking. It would be interesting to see the predictions of such a model.

phlogiston
May 2, 2010 11:53 am

this bit in normal text.
this bit in italics
and this bit normal again.

phlogiston
May 2, 2010 11:54 am

got it – thanks!

phlogiston
May 2, 2010 12:05 pm

toby says:
May 2, 2010 at 12:55 am
You are right about the astrology/constellations comment: the eye-brain system is a very avid pattern-seeker and can easily fix on spurious patterns. 8 years might correspond to a pair of el Nino-La Nina cycles, although these are far from regular. Nicola Scafetta calculated an ocean heat time constant parameter of about 7 years (others calculated it as 5). Some posters have pointed to a possible 4-year cycle coming from solar system centre-of-gravity wobble arising from large planet alignment, but this is controversial; this planetary wobble is even advanced as a cause of ENSO. A purely empirical prediction from such a (hugely speculative) 8-year jump scenario would be the next global temperature minimum around 2016. A few months back, Habibullo Abdussamatov from Russia predicted, based on solar and other factors, that future cooling would start from about 2013.

Girma
May 2, 2010 4:45 pm

You would not believe the following!
NASA FACTS – Global Warming – April 1998, NF-222
… For example, in the early 1970’s, because temperatures had been decreasing for about 25 to 30 years, people began predicting the approach of an ice age! For the last 15 to 20 years, we have been seeing a fairly steady rise in temperatures, giving some assurance that we are now in a global warming phase.
Dear NASA, because temperatures have now been increasing for about 25 to 30 years, why have people begun predicting catastrophic global warming?