Guest Post by Willis Eschenbach
In my earlier post about climate models, “Zero Point Three Times The Forcing”, a commenter provided the breakthrough that allowed the analysis of the GISSE climate model as a black box. In a “black box” type of analysis, we know nothing but what goes into the box and what comes out. We don’t know what the black box is doing internally with the input that it has been given. Figure 1 shows the situation of a black box on a shelf in some laboratory.
Figure 1. The CCSM3 climate model seen as a black box, with only the inputs and outputs known.
A “black box” analysis may allow us to discover the “functional equivalent” of whatever might be going on inside the black box. In other words, we may be able to find a simple function that provides the same output as the black box. I thought it might be interesting if I explain how I went about doing this with the CCSM3 model.
First, I went and got the input variables. They are all in the form of “netCDF” files, a standard format that contains both data and metadata. I converted them to annual or monthly averages using the computer language “R”, and saved them as text files. I opened these in Excel and collected them into one file. I have posted the data up here as an Excel spreadsheet.
Next, I needed the output. The simplest place to get it was the graphic located here. I digitized that data using a digitizing program (I use “GraphClick”, on a Mac computer).
My first procedure in this kind of exercise is to “normalize” or “standardize” the various datasets. This means adjusting each one so that the average is zero and the standard deviation is one. I use the Excel function “STANDARDIZE” for this purpose. This allows me to see all of the data in a common size format. Figure 2 shows those results.
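For readers working outside Excel, the standardization step can be sketched in a few lines of Python. This is only an illustration with made-up numbers; Excel’s STANDARDIZE does the same mean-zero, unit-standard-deviation rescaling when given the series’ own mean and standard deviation:

```python
def standardize(series):
    """Rescale a series to mean 0 and standard deviation 1 (what Excel's
    STANDARDIZE does when fed the series' own mean and standard deviation)."""
    n = len(series)
    mean = sum(series) / n
    # Sample standard deviation (n - 1 denominator), matching Excel's STDEV
    sd = (sum((x - mean) ** 2 for x in series) / (n - 1)) ** 0.5
    return [(x - mean) / sd for x in series]

forcing = [0.1, 0.3, 0.2, 0.5, 0.4]   # hypothetical forcing values
z = standardize(forcing)
print(round(sum(z), 6) == 0.0)        # → True (mean is zero after standardizing)
```

Once every series is on this common scale, their shapes can be compared directly, which is exactly what Figure 2 shows.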
Figure 2. Standardized forcings used by the CCSM 3.0 climate model to hindcast the 20th century temperatures. Dark black line shows the temperature hindcast by the CCSM3 model.
Looking at that, I could see several things. First, the CO2 data has the same general shape as the sulfur, ozone, and methane (CH4) data. Next, the effects of the solar and volcano data were clearly visible in the temperature output signal. This led me to believe that the GHG data, along with the solar and the volcano data, would be enough to replicate the model’s temperature output.
And indeed, this proved to be the case. Using the Excel “Solver” function, I used the formula which (as mentioned above) had been developed through the analysis of the GISS model. This is:
T(n+1) = T(n) + λ ∆F(n+1) × (1 − exp(−1/τ)) + ∆T(n) × exp(−1/τ)
OK, now let’s render this equation in English. It looks complex, but it’s not.
T(n) is pronounced “T sub n”. It is the temperature “T” at time “n”. So T sub n plus one, written as T(n+1), is the temperature during the following time period. In this case we’re using years, so it would be the next year’s temperature.
F is the forcing, in watts per square metre. This is the total of all of the forcings under consideration. The same time convention is followed, so F(n) means the forcing “F” in time period “n”.
Delta, or “∆”, means “the change in”. So ∆T(n) is the change in temperature since the previous period, or T(n) minus the previous temperature T(n-1). ∆F(n), correspondingly, is the change in forcing since the previous time period.
Lambda, or “λ”, is the climate sensitivity. Tau, or “τ”, is the lag time constant, which sets the amount of lag in the response of the system to forcing. And finally, “exp(x)” means the number e (about 2.71828) raised to the power of x.
So in English, this means that the temperature next year, T(n+1), is equal to the temperature this year, T(n), plus the immediate temperature change due to the change in forcing, λ ∆F(n+1) × (1 − exp(−1/τ)), plus the lag term ∆T(n) × exp(−1/τ) from the previous forcing. This lag term is necessary because the effects of changes in forcing are not instantaneous.
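To make the recursion concrete, here is a minimal sketch of the equation in Python. The forcing series, λ, and τ below are made up for illustration, not the fitted CCSM3 values:

```python
import math

def lagged_response(forcings, lam, tau, t0=0.0):
    """Step the one-line lagged model forward:
    T(n+1) = T(n) + lam * dF(n+1) * (1 - exp(-1/tau)) + dT(n) * exp(-1/tau)
    """
    decay = math.exp(-1.0 / tau)     # exp(-1/tau), the lag factor
    temps = [t0]
    dT = 0.0                         # previous period's temperature change
    for i in range(1, len(forcings)):
        dF = forcings[i] - forcings[i - 1]           # change in forcing
        dT = lam * dF * (1.0 - decay) + dT * decay   # immediate + lagged terms
        temps.append(temps[-1] + dT)
    return temps

# A one-time step change in forcing of 1 W/m2, with lam = 0.3 and tau = 3:
# the temperature relaxes gradually toward lam * dF = 0.3
temps = lagged_response([0.0] + [1.0] * 200, lam=0.3, tau=3.0)
print(round(temps[-1], 4))  # → 0.3
```

Note how the lag term spreads the response out over time: the first year captures only part of the eventual warming, and the remainder arrives geometrically in later years.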
Figure 3 shows the final result of that calculation. I used only a subset of the forcings, which were the greenhouse gases (GHGs), the solar, and the volcanic inputs. The size of the others is quite small in terms of forcing potential, so I neglected them in the calculation.
Figure 3. CCSM3 model functional equivalent equation, compared to actual CCSM3 output. The two are almost identical.
As with the GISSE model, we find that the CCSM3 model also slavishly follows the lagged input. The match once again is excellent, with a correlation of 0.995. The values for lambda and tau are also similar to those found during the GISSE investigation.
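For readers without Excel, the Solver step can be imitated with a coarse grid search over λ and τ. This is only a sketch of the fitting idea against synthetic data (a made-up forcing series and known “true” parameters), not the actual CCSM3 forcings or the spreadsheet fit:

```python
import math

def model(forcings, lam, tau):
    """The lagged-response equation, stepped over a forcing series."""
    decay = math.exp(-1.0 / tau)
    temps, dT = [0.0], 0.0
    for i in range(1, len(forcings)):
        dF = forcings[i] - forcings[i - 1]
        dT = lam * dF * (1.0 - decay) + dT * decay
        temps.append(temps[-1] + dT)
    return temps

# Synthetic "black box" output generated with known lam = 0.3, tau = 2.5:
# a slow trend plus a brief pulse, purely for illustration.
forcings = [0.02 * n + (0.5 if 30 <= n <= 33 else 0.0) for n in range(100)]
target = model(forcings, 0.3, 2.5)

# Coarse grid search over lam (0.10-0.60) and tau (0.5-5.0), playing the role
# of Excel's Solver: pick the pair minimising the sum of squared errors.
best = min(
    ((l / 100.0, t / 10.0) for l in range(10, 61) for t in range(5, 51)),
    key=lambda p: sum((a - b) ** 2
                      for a, b in zip(model(forcings, *p), target)),
)
print(best)  # → (0.3, 2.5)
```

Solver uses a gradient-based search rather than a grid, but the objective is the same: minimise the mismatch between the one-line equation and the model’s output.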
So what does all of this mean?
Well, the first thing it means is that, just as with the GISSE model, the output temperature of the CCSM3 model is functionally equivalent to a simple, one-line lagged linear transformation of the input forcings.
It also implies that, given that the GISSE and CCSM3 models function in the same way, it is very likely that we will find the same linear dependence of output on input in other climate models.
(Let me add in passing that the CCSM3 model does a very poor job of replicating the historical decline in temperatures from ~ 1945 to ~ 1975 … as did the GISSE model.)
Now, I suppose that if you think the temperature of the planet is simply a linear transformation of the input forcings plus some “natural variations”, those model results might seem reasonable, or at least theoretically sound.
Me, I find the idea of a linear connection between inputs and output in a complex, multiply interconnected, chaotic system like the climate to be a risible fantasy. It is not true of any other complex system that I know of. Why would climate be so simply and mechanistically predictable when other comparable systems are not?
This all highlights what I see as the basic misunderstanding of current climate science. The current climate paradigm, as exemplified by the models, is that the global temperature is a linear function of the forcings. I find this extremely unlikely, from both a theoretical and practical standpoint. This claim is the result of the bad mathematics that I have detailed in “The Cold Equations”. There, erroneous substitutions allow them to cancel everything out of the equation except forcing and temperature … which leads to the false claim that if forcing goes up, temperature must perforce follow in a linear, slavish manner.
As we can see from the failure of both the GISS and the CCSM3 models to replicate the post 1945 cooling, this claim of linearity between forcings and temperatures fails the real-world test as well as the test of common sense.
w.
TECHNICAL NOTES ON THE CONVERSION TO WATTS PER SQUARE METRE
Many of the forcings used by the CCSM3 model are given in units other than watts/square metre. Various conversions were used.
The CO2, CH4, N2O, CFC-11, and CFC-12 values were converted to W/m2 using the various formulas of Myhre as given in Table 3.
Solar forcing was converted to equivalent average forcing by dividing by 4.
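The best-known of those conversions can be sketched as follows. The CO2 line is Myhre et al.’s simplified expression, ΔF = 5.35 ln(C/C0) W/m2, and the solar line is just the divide-by-4 geometric spreading described above; the 278 ppm baseline is an illustrative preindustrial value, not necessarily the one CCSM3 uses:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Myhre et al.'s simplified CO2 expression: dF = 5.35 * ln(C/C0) W/m2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def solar_forcing(tsi_change_wm2):
    """Spread a change in total solar irradiance over the whole sphere
    (disc-to-sphere area ratio of 1:4)."""
    return tsi_change_wm2 / 4.0

print(round(co2_forcing(556.0), 2))  # doubling from 278 ppm → 3.71 W/m2
print(solar_forcing(1.0))            # 1 W/m2 of TSI → 0.25 W/m2 average forcing
```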
The volcanic effect, which CCSM3 gives in total tonnes of mass ejected, has no standard conversion to W/m2. As a result we don’t know what volcanic forcing the CCSM3 model used. Accordingly, I first matched their data to the same W/m2 values as used by the GISSE model. I then adjusted the values iteratively to give the best fit, which resulted in the “Volcanic Adjustment” shown above in Figure 3.
[UPDATE] Steve McIntyre pointed out that I had not given the website for the forcing data. It is available here (registration required; the file is a couple of gigabytes).
@JoelShore “not just clouds but also greenhouse gases absorb (and re-emit) outgoing terrestrial radiation”
Seeing that land and water absorb heat energy from the sun, and that clouds block both incoming and outgoing radiation, it is perfectly feasible that the “warmer” surface could be created with ZERO greenhouse gases. So would anyone dare run a GCM allowing for this effect and a ZERO GHG effect (not counting clouds as gases)?
Willis,
a funny trick. You reproduced the temperature data T(n) mainly from themselves. Your formula for T(n+1) is a hidden autoregressive process of order 2, using the preceding data T(n) and T(n-1), modified by a “disturbing term” containing the external forcings.
You can get an even higher correlation factor r of 0.997 if you set all forcings to zero and optimize the coefficients for T(n) and T(n-1) independently.
“You can get an even higher correlation factor r of 0.997 if you set all forcings to zero and optimize the coefficients for T(n) and T(n-1) independently.”
And then, …….voila!
http://rocketscientistsjournal.com/2010/03/sgw.html
Jeff Glassman:
“IPCC’s (HadCRUT3) smoothed model for Earth’s temperature has a noise power of 0.0782^2 = 0.00614. Compared to the noise power in the original annual data, 0.2392^2 = 0.0573, smoothing reduces the variance in the temperature data by 89.3%. The noise power in the two-stage estimate from the Sun is 0.110 ^2 = 0.0120, a variance reduction 79.0%. The Sun provides an estimate of Earth’s global average surface temperature within 10% as accurate as IPCC’s best effort using temperature measurements themselves. Estimating Earth’s temperature from the Sun is to that extent as good as representing Earth’s temperature by smoothing actual thermometer readings. Moreover, to the extent that man might be influencing Earth’s temperature, the effect would lie within that 10% not taken into account by the models, at most one eighth the effect of the Sun. Any reasonable model for Earth’s climate must take variability in solar radiation into account before considering possible human effects.”
Hi Ecoeng,
sorry, but this paper of Jeff Glassman’s is not science.
It is not very surprising to find something that fits well to the temperature record over a certain time frame by simply playing around long enough with some other data.
Btw, my comment regarding the Eschenbach method does not mean that the temperature data are fully explainable by the sun’s activity or something else.
I just want to emphasize that his formula relies mainly on the data itself (= an autoregressive approach).
BLouis79 says:
The amount of greenhouse effect due to clouds vs. that due to greenhouse gases is quite well-characterized. In particular, note that the net effect of clouds is cooling (i.e., their blocking of solar radiation is a somewhat greater effect overall than their effect on outgoing longwave radiation). And, if you compute the temperature necessary for an earth with an albedo due only to its surface, you still get a value below the actual surface temperature. (In fact, even if you assume the earth is a perfect blackbody, i.e., has zero albedo, you get a temperature below the actual surface temperature.)
Besides which, it would be hard to explain how greenhouse gases magically would not affect the surface temperature. I know that this is desperately what you want to believe because of your ideology, but the science is what it is independent of what you want to believe.
Here, in more detail, is what is behind Eschenbach’s approach. He suggested:
T(n+1) = T(n) + λ ∆F(n+1)/τ + ∆T(n) exp(−1/τ)
that means:
T(n+1) = T(n) + [T(n) − T(n−1)] exp(−1/τ) + λ ∆F(n+1)/τ
T(n+1) = [1 + exp(−1/τ)] T(n) − exp(−1/τ) T(n−1) + λ ∆F(n+1)/τ
This has the form:
T(n+1) = A T(n) + B T(n−1) + C ∆F(n+1)
with
A = 1 + exp(−1/τ)
B = −exp(−1/τ)
C = λ/τ
Eschenbach now obtained a correlation factor of 0.995 by optimizing A, (B), and C. In this approach, A and B are not independent fit parameters, due to the chosen formula.
If you instead set C = 0 and optimize A and B independently, you get r = 0.997 in the best case. The autoregressive approach by itself, without taking the forcings into account, therefore gives a higher correlation.
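The algebra above can be checked numerically. The sketch below, using made-up forcing changes and the λ∆F/τ form of the formula as quoted in this comment, steps the temperature both ways and confirms the two forms produce the same series:

```python
import math

lam, tau = 0.3, 2.5                    # illustrative values, not the fitted ones
decay = math.exp(-1.0 / tau)
A, B, C = 1.0 + decay, -decay, lam / tau

dF = [0.0, 0.4, 0.1, -0.2, 0.3, 0.0, 0.5]   # made-up forcing changes

# Eschenbach's form: T(n+1) = T(n) + [T(n) - T(n-1)] exp(-1/tau) + lam*dF/tau
t1 = [0.0, 0.0]
for n in range(1, len(dF)):
    t1.append(t1[-1] + (t1[-1] - t1[-2]) * decay + lam * dF[n] / tau)

# AR(2) form: T(n+1) = A T(n) + B T(n-1) + C dF(n+1)
t2 = [0.0, 0.0]
for n in range(1, len(dF)):
    t2.append(A * t2[-1] + B * t2[-2] + C * dF[n])

print(all(abs(a - b) < 1e-12 for a, b in zip(t1, t2)))  # → True
```

This is why A and B are locked together in the original fit: both are functions of the single parameter τ, whereas a free AR(2) fit can adjust them independently.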
This demonstrates that Eschenbach’s approach is doubtful, because it uses internal information from the data instead of describing the data by external forcings, as climate models do!
@JoelShore
I am perfectly happy for science to reveal what is. At this point, the major factors appear to be solar irradiance and cosmic ray effects on clouds. Cosmic ray effects on clouds are notably missing from the CCSM3 model input data.
I have not seen any experimental laboratory verification of the postulated “greenhouse” mechanism in action – the magnitude of measurable temperature change caused by small amounts of CO2 +/- other gases. Perhaps someone can point me to some papers. It is unclear whether there is even agreement amongst proponents about what the postulated “greenhouse” mechanism is.
Hi Blouis,
your questions have simple answers:
“Cosmic ray effects on clouds are notably missing from the CCSM3 model input data.”
Simple: nobody knows the physical mechanism or its strength.
“I have not seen any experimental laboratory verification of the postulated “greenhouse” mechanism in action ”
Obviously, reproducing the greenhouse effect of the atmosphere in the lab would require an atmosphere with its most essential natural features: a vertical temperature gradient, a tropospheric ozone layer, solar irradiance, and absorption and emission of thermal and solar radiation through kilometers of a vertical air column.
Nico Baecker
“This demonstrates that Eschenbachs approach is doubtful because it uses internal information of the data instead of describing them by external forcings like climate models do!”
Deeply contradictory isn’t it that Nico Baecker can ‘dismiss’ Dr. Jeff Glassman’s work relating the ‘official’ IPCC AR4 Wang et al (2005) curve for the Sun’s activity (TSI) to the same IPCC AR4 ‘official’ HadCRUT3 temperature record since 1850 on the spurious grounds that it is ‘no science’.
Obviously Baecker is unable to even estimate the odds against this and unable to even admit that solar activity is a forcing.
Furthermore, in his naivety Baecker is obviously unaware that Glassman is a respected retired scientist (former head Hughes Aerospace Science Division) who was one of the 20th century’s leading experts on signal deconvolution techniques.
In fact, if Nico Baecker has, and uses, a cell phone, he is unwittingly using every day one of the very mathematical techniques Glassman developed!
The fact that Glassman used a mainstream deconvolution technique to identify the imprint of the Wang et al (2005) TSI plot is not ‘no science’. It is good, standard science.
In addition, it uses exactly the kind of ‘external forcing’ Nico asked for!!!!
If anything, Glassman’s approach has provided good evidence that there probably is some amplification of the solar forcing signal going on. A significant number of recent mainstream literature papers on whether there is a floor in the heliospheric field (despite Svalgaard saying yes, and setting it higher than it probably is even if there is one) are highly relevant to this question.
Then of course there are the now well reviewed 10Be, 14C and tree ring records of the last 9300 years (e.g. Steinhilber et al). How does one explain the solar activity and temperature record of the last 9300 years, in the absence of any evidence of CO2 forcing, if there isn’t any amplification of the solar signal occurring? In a word, you can’t!
Clearly there are no simple answers even now and if there were Nico Baecker certainly doesn’t have them.
Ecoeng says:
Great…So apparently you are? Please tell us what those odds are and how you calculated them.
There seems to be little evidence to verify this claim on the web. By what standard is he one of the leading experts…Did he publish a book on it, did he publish papers on it, is he on some path-breaking patents regarding it? (I couldn’t find any patents by a Glassman with the assignee being Hughes in the USPTO database.)
Frankly, from looking at his website, he appears to be pretty much of a crackpot who doesn’t even accept that humans are responsible for the rise of CO2 levels in the atmosphere…That puts him pretty far out there.
Joel Shore says:
“…humans are responsible for the rise of CO2 levels in the atmosphere…”
Classic misdirection [“Look over there! A kitten!”]
The real question is: what empirical, testable evidence is there that definitively shows that CO2 is harmful?
Joel Shore always avoids providing any evidence of global damage due to CO2, simply because there is no such evidence. Not to be too hard on Joel; no one else has any evidence of global harm from CO2, either.
Hi Ecoeng,
stay serious. I have no interest in dealing with a paper which at first glance contains so many crude statements and is not published in a scientific journal with review by experts. I’m interested in scientific findings, not in private opinions.
To get the interest of the scientific community, the author has, in my opinion, to answer at minimum the following obvious questions:
– what is the physical process behind this, or what could it be?
– why does this process select these particular periods of 134 and 46 years and this particular TSI derivative, and not anything else?
– can artifacts of the data treatment cause these findings? Note that the series used spans 160 years and that the 134-year length of the moving average is nearly as long. Every communications engineer with experience in signal analysis is aware of the influence of a limited time frame on the spectral signature of a signal, would investigate extensively for potential trouble with that, and would present those results. As it stands, I miss clear and convincing arguments to exclude the possibility that the findings are, in the end, rubbish.
That’s all I have to say about this paper. I would appreciate it if we came back to the original topic of this blog, Eschenbach’s funny formula.
Smokey says:
It is not misdirection when we are not addressing what Smokey wants to talk about in every thread. What we are trying to establish here is whether some guy (Jeffrey Glassman) who posts some un-peer-reviewed stuff on the internet looks like any sort of credible source, or whether he says things that every reasonable person knows are nonsense.
As always, Joel Shore avoids posting evidence, per the scientific method, showing global damage from CO2. As I stated, Joel’s comments are classic misdirection: instead of saying, “Look, a kitten!”, Joel Shore says, “Look, Dr Glassman!”
Produce measurable, verifiable evidence of global harm from CO2 — or admit that CO2 is harmless.
Cosmic ray effects on clouds appear to be backed by solid science. Shaviv appears to have data to enable him to correct for this effect.
http://www.phys.huji.ac.il/%7Eshaviv/articles/2004JA010866.pdf
It would be interesting to see Willis’ model run with just solar forcing and cosmic ray effect.
Hi BLouis79 ,
interesting, but as explained above, there is little room left to improve the correlation factor of Willis’ model by adding more forcings, because the main portion of the goodness of fit is not due to physical driving causes at all.
Joel Shore
“There seems to be little evidence to verify this claim on the web. By what standard is he one of the leading experts…Did he publish a book on it, did he publish papers on it, is he on some path-breaking patents regarding it? (I couldn’t find any patents by a Glassman with the assignee being Hughes in the USPTO database.)
Frankly, from looking at his website, he appears to be pretty much of a crackpot who doesn’t even accept that humans are responsible for the rise of CO2 levels in the atmosphere…That puts him pretty far out there.”
Dr. Jeffrey A. Glassman is one of the giants of 20th century signal processing. He is the sole author (1970) of one of the most highly cited papers in the history of FFTs (Fast Fourier Transforms), and his general N-point FFT is still the fastest algorithm available for any-length input series; hence its widespread use in signal processing. There is no need to go into the many other high points of Glassman’s outstanding career in applied avionics etc.
Childish utterings by types like Joel Shore such as the one quoted above are standalone proofs of what happens when babes would enter the woods……meaning nothing much at all except to signal (pun intended) the quiet end of any decent blog thread.
Well, I am glad to hear that Glassman was a giant in signal processing back in his day of 1970. I guess I have probably used some version of his FFT algorithms without knowing that he was the inventor.
However, his utterings in regard to climate science seem to be junk … which is of course why the only place they are “published” is on the web.