Empirical Model Of The Global Mean Surface Temperature

Cooling of the Multidecadal Cyclic GMST until about the 2030s suggests La Niña conditions will dominate over the next twenty years.

Guest post by Girma Orssengo, PhD

The IPCC’s climate-model prediction of global warming of about 0.2 deg C per decade for the next two decades is contrary to the observed climate pattern.

In the graphs below, which show the results of the climate data analysis, the Observed Global Mean Surface Temperature (GMST) in Graph “a” has an oscillating Residual GMST of +/- 0.2 deg C, shown in Graph “b”, and a Multidecadal Cyclic GMST of +/- 0.1 deg C, shown in Graph “e”.

Because of these two oscillating components of the Observed GMST, it is incorrect for the IPCC to claim a constant warming rate of 0.2 deg C per decade sustained for two decades.

Note that for the parameters of the model given in Equation 1, the Residual GMST from 1885 to 2011 shown in Graph “b” has zero mean and zero trend. The result shown in Graph “e” indicates cooling of the Multidecadal Cyclic GMST until about the 2030s. This result suggests La Niña conditions will dominate in the next twenty years. Finally, Graph “f” demonstrates there was no change in the climate pattern before and after the mid-20th century, contrary to the IPCC’s claim.

[Figures: Graphs “a” (Observed GMST), “b” (Residual GMST), “c” (Model Smoothed GMST), “d” (Secular GMST), “e” (Multidecadal Cyclic GMST) and “f” (climate pattern comparison)]

Observed GMST (Graph a) = Residual GMST (Graph b) + Model Smoothed GMST (Graph c)

Model Smoothed GMST = a*Cos[2*Pi*(Year-1910)/60] + b*(Year-1910)^2 + c*(Year-1910) + d

Where a = -0.1050, b = 3.598*10^(-5), c = 3.27*10^(-3), d = -0.345 (Equation 1)

Secular GMST = b*(Year-1910)^2 + c*(Year-1910) + d (Equation 2)

MultiDecadal Cyclic GMST = a*Cos[2*Pi*(Year-1910)/60] (Equation 3)
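For readers who want to reproduce the curves, here is a minimal Python sketch of Equations 1 to 3, using the coefficients given in Equation 1; the year range and the printed check are illustrative only:

import numpy as np

# Coefficients from Equation 1
a, b, c, d = -0.1050, 3.598e-5, 3.27e-3, -0.345

def model_smoothed_gmst(year):
    # Equation 1: 60-year cosine plus quadratic secular trend
    t = year - 1910
    return a * np.cos(2 * np.pi * t / 60) + b * t**2 + c * t + d

def secular_gmst(year):
    # Equation 2: quadratic secular component only
    t = year - 1910
    return b * t**2 + c * t + d

def multidecadal_cyclic_gmst(year):
    # Equation 3: 60-year cyclic component only
    return a * np.cos(2 * np.pi * (year - 1910) / 60)

# The cyclic component bottoms out at -0.105 deg C in 1910, 1970 and 2030,
# which is the "cooling until about the 2030s" referred to above
print(multidecadal_cyclic_gmst(2030))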

DocMartyn

I do like the fit, but I would prefer a plot of Temp minus sine wave vs log([CO2]).

Kasuha

I’d call this “yet another regression”. What I’m missing is an explanation of the physical relevance of the chosen regression components, particularly the 60-year cycle and the quadratic baseline.

Joachim Seifert

Girma: Here we have again empirical statistics of the 60-year trisynodic Scafetta cycle, which we discussed in January/February, when our Willis got mad about “statistical curve fitting”, demanding that its astronomical climate-forcing background be put on the table…. I will do it this coming spring, because this cycle is an important decadal Holocene climate forcing cycle…. JS

tallbloke

This doesn’t allow for the sudden and deep solar slowdown which has begun, and is likely to reach a nadir around 2035. It is a non-linear non-sinusoidal interregnum which occurs on a complex cycle. For those who don’t think solar variation affects climate much – keep watching.

Tony B (another one)

I think a little more explanation might be helpful….

Manfred

The IPCC define what it is to be between a rock and a hard place.

It’s always been a problem for me that the period 1910-1940 showed the same or faster temperature increase than the 1979-1998 period. Fully half the temperature increase of the 20th century happened before 1950. Since we all know that the CO2 output of the world greatly accelerated after 1950, with the great industrialization of the world, it makes no sense that the temperature increase prior to 1950 should be the same as after 1950. That would imply that CO2 is not the cause of the warming but something else. Or that CO2 took over after the mysterious unexplained warming from 1910-1940 stopped. Since we now know of a 1000-year cycle where temperatures peak roughly every 1000 years, it seems that maybe the warming we are getting now, and prior to 1950, is mostly or all related to that phenomenon.

richardscourtney

Yes, I and others have been saying this (including on WUWT) for years.
Richard

vukcevic

Now you need to plot CO2 against graph “d”; then you will get something like what Dr. Vaughan Pratt did on JC’s Climate Etc. some time last year. He claims that graph “d” is the CO2 contribution.
Do you have a different attribution?
Dr. Pratt’s problem is the same one you encounter: analyzing, in climatic terms, too short a data set.
Now if you turn to the CET and consider a 350- rather than 110-year-long data set, the ambiguity disappears:
http://www.vukcevic.talktalk.net/CET-NV.htm
Here the oscillating curves have true observational (empirical) properties; they are actual spectral components derived from the actual data set.
I do not see anything noteworthy in there that can be attributed to the recent CO2 increase that didn’t occur in the low-CO2 era.

Robbie

More La Niña conditions mean more droughts in North America. A return to the Dust Bowl.
That’s bad news. Very bad news if this is true.

lgl

It’s much worse than that. You have to include the 20 yr and 9 yr cycles too.
http://virakkraft.com/Temp-future.png

jorgekafkazar

Looking at the graphs, it would appear that the rate of warming in raw data Graph “a” from 1910 to 1940 is steeper than the rate from 1960 to 2010. In Graph “c,” however, after mucking about with residuals and model formation, the latter period’s slope is slightly steeper than the former’s.
Also, I note that, according to the graph titles, Graph “c” is derived from Graph “a” minus Graph “b,” and Graph “b” is derived from Graph “a” minus Graph “c.” A labeling error somewhere. Otherwise interesting, possibly significant. It would be nice if we had a longer set of observations that Hansen hasn’t diddled with.

Johnny

It’s interesting you mention that the temperature rise looks slightly lower between 1910-1940 than after 1950, because the temperature charts I’ve seen for years showed that 1910-1940 went up faster. However, GISS and other data sources keep adjusting the temperatures of the past. So now it seems that 1910-1940 is slower than after 1950, and also that the decline in temperature between 1945-1975 has become more of a flat period than a period of decline. Why they would feel the need to muck with past temperatures, or how they justify it, is beyond me. It is interesting that the modifications of past temperatures 100% of the time reduce those temperatures. How amazingly coincident with the theory that CO2 is the cause of all warming.

We’ve seen over and over again in other sciences and disciplines that scientists of all types are consciously or subconsciously prone to experimental bias that confirms their theory. I have seen numerous times that when there are errors in the temperature data that make it look like temperatures decreased for any period of time, intense scrutiny is applied to figure out how to discard that data, yet when temperatures come out higher they receive no scrutiny at all. This has been so egregious that at times it’s been hard to believe.

For instance, a few years ago the temperature of the world was reported as hitting a new peak. When someone glanced at the data, they found that the Soviet Union’s temperatures for July had inadvertently been copied into August and September. Since the temperature can be 10-20 degrees colder a few months later, and since Russia is a very large land mass, this had the effect of massively raising temperatures for the whole planet for that year, until someone noticed that Russia was incredibly hotter than it seemed it should be and pointed this out. The “scientists” at NASA apparently didn’t notice that the whole country of Russia was 10 degrees warmer than it should be in their data. Talk about extreme evidence of experimenter bias.

Normally the argument is that scientists cross-check the work of other scientists, that there is a “competition” of scientists that weeds out errors of this type. So why did it take a non-academic lay person to find this large error? Again, proof that the scientific community is not policing itself. This kind of thing happens regularly and is almost never found by other scientists but by lay people. The temperature record would not have to be manipulated much for a trend to become significant or insignificant, so the need for accuracy in this data is paramount.

Tom in Indy

TonyB
I think a little more explanation might be helpful….
I agree. I believe there is an unwritten law that links the level of explanation with the level of comprehension. In other words, if you can’t explain your results, then you probably don’t understand them.

Girma, you and I have discussed this before on Climate Etc. It follows from your analysis that there is no CO2 signal in the temperature/time graph that is detectable above the natural noise. Therefore, by definition, the total climate sensitivity for CO2 is indistinguishable from zero, since no signal is detectable.

Michael D Smith

I get similar results but use an x^2 factor instead of a logarithmic one. Cooling until 2030±5, depending on the dataset. The wavelength changes quite a bit depending on whether you are looking at global, regional, or local areas. My fit iterates until it reaches minimum error, to about 9 digits, not that it is that precise.
Check out the one on sea level too: declining until 2019! But this was using January data, before the University of Colorado decided such a result was just a little too much to bear. With Envisat out of the way, it makes their job a lot easier.
http://naturalclimate.wordpress.com/

Dr Burns

Girma,
Place realistic error bands on the data and all your graphs disappear in the haze.
Even the recording accuracy (+/- 0.5 deg C for most of the data) swamps any trends.

Ian W

Robbie says:
September 3, 2012 at 1:12 pm
More La Niña conditions mean more droughts in North America. A return to the Dust Bowl.
That’s bad news. Very bad news if this is true.

Well, I hate to be the bearer of bad tidings… but if you look at the Unisys SST anomalies map http://weather.unisys.com/surface/sst_anom_new.gif you will see a plume of upwelling cold water from the coast of Peru out into the Pacific. That looks very much like a La Niña. I think that the Nino 3.4 metrics have been fooled by the huge pool of cold water to the north of the Nino boxes. It certainly does not look like an El Niño.

Steven Mosher says: September 3, 2012 at 2:15 pm
http://xkcd.com/687/
……………………
Hi Steven
Looks very familiar
CET = (pi) f (a, b, c)
a = algorithm for width of English Channel due to the continental drift
b = baroclinic pressure at the Earth core (by proxy of Earth’s magnetic field)
c = ciclo solare (solar cycle)

X Anomaly

Grima,
extrapolate graph c into a cycle and predict “something else”; predict something that has already happened (tick), then apply it to more data (do it again).
“Make something else come out right, in addition” -Feynman
Otherwise it’s just cargo cult crap (like the hockey stick). Time to do something more useful with a PhD.

Fernando(in Brazil)

X anomaly,
Your opinion might carry some weight if you could spell Girma’s name correctly.

kadaka (KD Knoebel)

Caption check:
Graph d = Graph c – Graph e
Graph e = Graph c – Graph d
Therefore:
Graph e = Graph c – (Graph c – Graph e)
Graph e = Graph e
So Graph e has no relation to Graph c, nor to Graph d, only to itself.
Is that what you were trying to say?

P. Solar

This article, looking at the effects of Hadley “corrections” to SST
http://judithcurry.com/2012/03/15/on-the-adjustments-to-the-hadsst3-data-set-2/
contains this graph:
http://curryja.files.wordpress.com/2012/03/icoads_monthly_adj0_40-triple1.png which shows a similar fit, but using several periods, not just a 60-year period.
The periods and starting years are determined by non-linear regression, and are thus determined by the data.
An exponential increase in CO2 would only produce a linear (or likely smaller) increase in temperature. I see no reason to fit a quadratic.
Looking at the rate of change (dT/dt) is most informative, since it removes the constant baseline temperature, which is arbitrary in the case of “anomalies”. The “constant” 0.42 K/century in dT/dt corresponds to a linear rise in temperature.
tallbloke says:
September 3, 2012 at 12:52 pm
>> This doesn’t allow for the sudden and deep solar slowdown which has begun, and is likely to reach a nadir around 2035. It is a non-linear non-sinusoidal interregnum which occurs on a complex cycle. For those who don’t think solar variation affects climate much – keep watching.
>>
Well, Girma’s plot doesn’t, but the three terms in the dT/dt plot may well catch something like it. The system probably is “non-linear, non-sinusoidal” in reality, but 164 + 64 + 21 year cycles seem to give something near to what may be expected from very low cycles 24 and 25.
The d2T/dt2 plot is also interesting but requires more commentary, which I’ll omit for brevity here.
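To illustrate the dT/dt point numerically, here is a minimal Python sketch with a synthetic anomaly series (everything in it is illustrative): differentiating removes the arbitrary baseline, while a linear rise in T appears as a constant in dT/dt.

import numpy as np

# Synthetic yearly anomalies: 0.4 K/century linear rise plus a 60-year cycle
years = np.arange(1850, 2012)
temp = 0.004 * (years - 1850) - 0.1 * np.cos(2 * np.pi * (years - 1910) / 60)

# Central-difference rate of change in K/year; any constant offset drops out
dTdt = np.gradient(temp, years)
print(dTdt.mean() * 100, "K/century")  # recovers roughly the 0.4 K/century rise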

LazyTeenager

Looks like a lame curve fitting exercise without any physical basis.
I could come up with a dozen functional forms, fit each to the data, and come up with tiny residuals. It would all be meaningless.
It is only meaningful if there is a physical basis for the functional form.
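LazyTeenager’s claim is easy to demonstrate. A minimal sketch with synthetic data (all values here are made up for illustration) fits polynomials of several unrelated degrees to the same series, and each gives similarly small residuals:

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1880.0, 2010.0, 131)
y = 0.005 * (x - 1880) + 0.1 * np.sin(2 * np.pi * x / 60) + rng.normal(0, 0.1, x.size)

xs = (x - x.mean()) / x.std()  # rescale for numerical conditioning
for deg in (3, 6, 9):
    coeffs = np.polyfit(xs, y, deg)
    resid = y - np.polyval(coeffs, xs)
    print(f"degree {deg}: residual std = {resid.std():.3f}")

All three fits track the data closely, which is precisely why small residuals alone do not validate a functional form.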

old construction worker

So, another computer model says cooling for the next 15 years. Somehow, I think “Mother Nature” will throw us a curve ball.

kadaka (KD Knoebel)

From LazyTeenager on September 3, 2012 at 4:16 pm:

I could come up with a dozen functional forms, fit each to the data, and come up with tiny residuals.

Good idea, let’s see what the possibilities are. Post your work when done.

george e smith

“””””…..Empirical Model Of The Global Mean Surface Temperature…..”””””
So we have an “empirical” (made up) model of the GMST, something we have no believable idea of. Or alternatively, of which, we have no believable idea.
Back in the 1960s, there was a famous paper on an “empirical” model of the fine structure constant alpha, or more strictly of 1/alpha, known to be around 137.
The “empirical” model was: 1/alpha = fourth root of ((pi)^a * b^c * d^e * f^g), where a through g are small integers, not necessarily different. The paper gave the actual values of a through g.
The “empirical” model’s predicted, excuse me, projected, value agreed with the best peer-reviewed experimentally measured value for 1/alpha to less than 2/3 of the standard deviation of that measured value, which happens to be known to a few parts in 10^8. I would say that’s a pretty “empirical” agreement with reality.
Like Dr Orssengo’s “empirical” model, this fine-structure model had no known connection to the physical universe; but it obviously was correct, because it was accurate to a few parts in 10^8, so it was wildly embraced, although no-one could discern how it connected to reality.
A month later, a computer geek published a list of several other sets of values (small integers) for a, b, c, d, e, f, g which also gave 1/alpha to within the standard deviation, currently about 4.5E-8. One of those was twice as accurate as the earlier result.
A month after that, a more theoretical geek published a model of an N-dimensional sphere whose radius was 1/alpha. The solutions were the lattice points in this N-space, for different a through g, lying within a thin shell whose inner and outer radii were less than or greater than 1/alpha by the standard-deviation increment; he derived a complete list of at least eight values that fit the “empirical” model.
So if you don’t think you can get a believable “empirical” result by simply f*****ing around with numbers, think again; you CAN!

Bruce of Newcastle

I’ll mention to Dr Orssengo these two graphs where HadCRUT3 since 1850 is detrended by the quadratic y = 0.000028*(x-1850) – 0.41.
The next logical step is to incorporate the correlation of previous solar cycle length to temperature. After combining this and the cycle shown in Dr Orssengo’s graph (e) the residual appears to fit well with Lindzen & Choi’s value for 2xCO2.

Bruce of Newcastle

Sorry, slight correction. The quadratic should be y = 0.000028*(x-1850)^2 – 0.41.
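For concreteness, the detrending step Bruce describes can be sketched in Python; the anomaly series below is a synthetic stand-in for HadCRUT3, and only the quadratic comes from the comment:

import numpy as np

# Synthetic stand-in for annual HadCRUT3 anomalies since 1850
x = np.arange(1850, 2012)
hadcrut = 0.000028 * (x - 1850)**2 - 0.41 + 0.1 * np.sin(2 * np.pi * (x - 1880) / 60)

# Subtract the corrected quadratic; what remains is the cyclic part
detrended = hadcrut - (0.000028 * (x - 1850)**2 - 0.41)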

Maus

LazyTeenager: “Looks like a lame curve fitting exercise without any physical basis. ”
Yep. Just like Newton’s gravity, Planck’s quanta, Einstein’s relativity, the Higgs boson, Physics, Astronomy, Psychology, Sociology, Climate Science, MMTS adjustments, TOBS adjustments, treemometers and practical proxy practices. Physics may seem particularly queer in that list. But physics is based on empirical modelling tasks taken to Platonic abstraction, for better or worse.
Now, if you’re asking for materialist explanans, then that’s an entirely different affair, and important only to Philosophers and Metaphysicians (They have tonics.). For otherwise it’s a case of ‘Eppur si muove’. Though the work in the OP is still behind the curve of the resident Vukcevic when speaking of Fourier analysis of climatological cycles.

See Klyashtorin and Lyubushin, 2007. While the modern “Atmospheric CO2 increase as a lagging effect of ocean warming” may have foundered on the shoals of statistics lacking physical basis, my sense is this one will not. It was from fish, after all, that we learned of the PDO.

joeblack25

This posting is handwaving, without even bothering to make clear how the results were derived. Since the author has a Ph.D., he should know that there is published literature using unit-root time-series analysis to determine whether the global temperature anomaly (GTA) series is stationary and whether there is an upward trend in the series. The author should know the following peer-reviewed papers on this topic:
http://www.uoguelph.ca/~rmckitri/research/warming.pdf
http://www.buseco.monash.edu.au/ebs/pubs/wpapers/2011/wp4-11.pdf
http://www.earth-syst-dynam-discuss.net/3/561/2012/esdd-3-561-2012.html

Mooloo

Looks like a lame curve fitting exercise without any physical basis.
No supporter of GCMs should object to curve fitting.
While the writers of GCMs give explanations for their epicycles, that doesn’t make those explanations true. Occam’s Razor and all that.

Girma

I’d call this “yet another regression”. What I’m missing is an explanation of the physical relevance of the chosen regression components, particularly the 60-year cycle and the quadratic baseline.

The purpose of an empirical model is to represent the observed data as well as possible, rather than to explain the physics. It is clear in Graph “f” that 100% of the observed data is bounded by the model.

HenryP

Tallbloke says
This doesn’t allow for the sudden and deep solar slowdown which has begun, and is likely to reach a nadir around 2035. It is a non-linear non-sinusoidal interregnum which occurs on a complex cycle. For those who don’t think solar variation affects climate much – keep watching.
Henry@Tallbloke or anyone who can help
On the deceleration of maximum temperatures I discovered it is an AC wave, but I don’t know how to do the plot (best fit). See:
http://wattsupwiththat.com/2012/08/23/agu-link-found-between-cold-european-winters-and-solar-activity/#comment-1067753
Anybody here who can help, please?

Nylo

A model is only useful if it allows predicting future behaviour. I see no predictions by the author that could be later falsified by the real outcome.

Girma

Nylo
A model is only useful if it allows predicting future behaviour. I see no predictions by the author that could be later falsified by the real outcome.
The model established a pattern, as shown in Graph “f”. From this graph, it is easy to predict the climate if the pattern continues => little warming in the next 15 years.

vukcevic

Maus says: September 3, 2012 at 7:38 pm
…….the resident Vukcevic when speaking of….
The resident Vukcevic here on WUWT is well behind the curve of himself, but there is always hope that he may catch up soon with his own private research.
Nylo says:
September 3, 2012 at 11:58 pm
A model is only useful if it allows predicting future behavior
Doing predictions is froth with danger.
I do extrapolation. When it fails, it is fault of the used data set limitations such as length, resolution, compilation and many other factors I can’t be held accountable for.
🙂

Really? But the output of the sun is still falling and the sun is the ONLY source of heat we have to drive climate.

P. Solar

HenryP says:
Henry@Tallbloke or anyone who can help
On the deceleration of maximum temperatures I discovered it is an AC wave, but I don’t know how to do the plot (best fit). See:
http://wattsupwiththat.com/2012/08/23/agu-link-found-between-cold-european-winters-and-solar-activity/#comment-1067753
Anybody here who can help, please?
Try gnuplot. It has a fit command for non-linear least-squares fitting of your desired function, and very flexible ways to plot it all. It takes a bit of reading to learn how to get the best from it, but it’s an effort well worth the time. It goes well beyond just plotting once you master it.
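For those who prefer Python to gnuplot, the same kind of non-linear least-squares fit can be sketched with scipy’s curve_fit; the series below is a synthetic placeholder, not HenryP’s data:

import numpy as np
from scipy.optimize import curve_fit

def wave(t, amp, period, phase, offset):
    # A generic "AC wave": amplitude, period, phase and baseline offset
    return amp * np.sin(2 * np.pi * t / period + phase) + offset

rng = np.random.default_rng(1)
t = np.arange(1900, 2012, dtype=float)
y = 0.3 * np.sin(2 * np.pi * t / 22 + 1.0) + rng.normal(0, 0.1, t.size)

p0 = (0.3, 20.0, 0.0, 0.0)  # non-linear fits need a sensible starting guess
params, cov = curve_fit(wave, t, y, p0=p0)
print("amplitude, period, phase, offset:", params)

As with gnuplot’s fit command, the starting guess matters: start the period too far from the truth and the fit will converge to a harmonic or not at all.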

P. Solar

vukcevic says:
Doing predictions is froth with danger.
I do extrapolation. When it fails, it is fault of the used data set limitations such as length, resolution, compilation and many other factors I can’t be held accountable for.
🙂
Data does not extrapolate; only fitting a model can do that. Assumptions about the suitability of the model, the method of determining the fitted parameters, and the assumption that the model will still be valid beyond the range of the source data are also key factors.
All of which can be confounded by poor-quality or manipulated data sets. So the whole exercise is indeed fraught with froth. 😉

P. Solar

Girma says: The model established a pattern, as shown in Graph “f”. From this graph, it is easy to predict the climate if the pattern continues => little warming in the next 15 years.
Your residuals are about twice the amplitude of your 60y cycle. So either noise is twice as big as your signal or there are other more significant factors you are not accounting for.
Why do you assume the modelled part will dominate the larger residuals you do not capture in the model?
Why did you choose to fit a parabola, and what could cause such a variation?
BTW, I agree that what you suggest is likely but don’t see that it follows from what you show here.

P. Solar

LazyTeenager says:
“It is only meaningful if there is a physical basis for the functional form.”
So empirical tide predictions are “meaningless” then. Nonetheless, they have proved incredibly reliable the world over for more than a century.
I suppose being LazyTeenager means you don’t need to think before posting.

P. Solar says: September 4, 2012 at 3:22 am
Data does not extrapolate
Agree; perhaps I should have been less circumspect. The first part of my post relates to Maus’ reference to Fourier analysis, so I continued with the ‘extrapolation’ comment. Extrapolation is often used once the spectral content of the data is found, to show what it may reveal either forward or back in time. Using MS Word’s auto spellchecker is also ‘fraught with froth’.

Eli Rabett

A model is only useful if it allows predicting future behaviour. I see no predictions by the author that could be later falsified by the real outcome.

Spoken like a chartist, and completely wrong. The major benefit of good models is to help one understand how systems work.

Girma

P. Solar
Your residuals are about twice the amplitude of your 60y cycle. So either noise is twice as big as your signal or there are other more significant factors you are not accounting for.
Good question.
Though the magnitude of the Multidecadal Cyclic GMST is only half of the Residual GMST, it appears that the Multidecadal Cyclic GMST drives the Residual GMST. For example, in Graph “f”, in the 1910s, the Multidecadal Cyclic GMST was below the secular trend curve and the Residual GMST was also near the bottom of the GMST band. In contrast, in the 1940s, the Multidecadal Cyclic GMST was above the secular trend curve and the Residual GMST was also near the top of the GMST band. In the 1970s, the Multidecadal Cyclic GMST was below the secular trend curve and the Residual GMST was also near the bottom of the GMST band.
Therefore, it appears that the Multidecadal Cyclic GMST drives the Residual GMST.

Check out my paper on Central UK Max Temp vs Sunshine Hours at NothingSettledNothingCertain.com (also at Tallbloke’s Talkshop): by comparing Bright Sunshine Hours to Max Temperatures from 1930, I derived very much the same thing, except that instead of the curvilinear trend I got a linear trend (which doesn’t look unreasonable here, either). The AMO/PDO cyclicity, loaded onto a sunshine-hours linear rise, accounted for all but 0.1 C/century, which could easily be UHIE or land-use related.
I don’t think there is a global maximum-sunshine-hours record going back to 1920. Too bad: any increase in bright sunshine is a decrease in cloud cover. Ipso facto, Lord Monckton might say.
Of course you have to explain why there has been an historical reduction in cloud cover, but perhaps the warmists could say that more moisture causes more rain, which causes fewer clouds at some point in the day, which leads to more bright-sunshine hours in the records: there’s a political spin for everything.

george e smith

“””””…..Eli Rabett says:
September 4, 2012 at 5:41 am
A model is only useful if it allows predicting future behaviour. I see no predictions by the author that could be later falsified by the real outcome.
Spoken like a chartist, and completely wrong. The major benefit of good models is to help one understand how systems work……”””””
Well george e. smith believes that the major benefit of good models is to help one understand how the models work. He thinks we should be so lucky as to have real systems behave the same as our models. In the case of the GCMs, he believes the climate system does not behave the same as the models; or else we wouldn’t need 13 of them, or whatever the count is now up to.