One Step Forward, Two Steps Back

Guest Post by Willis Eschenbach

David Rose has posted this, from the unreleased IPCC Fifth Assessment Report (AR5):

‘ECS is likely in the range 1.5C to 4.5C… The lower limit of the assessed likely range is thus less than the 2C in the [2007 report], reflecting the evidence from new studies.’ SOURCE

I cracked up when I read that … despite the IPCC’s claim of even greater certainty, it’s a step backwards.

You see, back around 1980, about 33 years ago, we got the first computer-model estimate of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5°C to 4.5°C.

However, that was reduced in the Fourth Assessment Report to a narrower, presumably more accurate range of 2°C to 4.5°C. Now they’ve backed away from that and retreated to their previous estimate.

Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power, and a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple of orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.

And the estimates of climate sensitivity have not gotten even the slightest bit more accurate.

Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.

So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?

I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.

And what is the wrong path?

The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:

∆T = lambda ∆F

where T is temperature, F is forcing, lambda is the climate sensitivity, and ∆ means “the change in”.
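To make the equation concrete, here is a minimal sketch of the arithmetic (illustrative numbers only: the oft-cited 5.35 ln(2) ≈ 3.7 W/m² forcing for doubled CO2, and an assumed lambda of 0.8°C per W/m²):

```python
import math

# Canonical linear-forcing picture: delta_T = lambda * delta_F.
# Both numbers below are illustrative assumptions, not settled values.
LAMBDA = 0.8                     # assumed sensitivity, degC per (W/m^2)
delta_F = 5.35 * math.log(2.0)   # standard expression for doubled-CO2 forcing, ~3.7 W/m^2

delta_T = LAMBDA * delta_F
print(f"delta_F = {delta_F:.2f} W/m^2  ->  delta_T = {delta_T:.2f} degC")
# With lambda = 0.8 this lands near 3 degC, inside the 1.5-4.5 degC range.
```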

I have shown, in a variety of posts, that the temperature of the earth is not a function of the change in forcings. Instead, the climate is a governed system. As an example of another governed system, consider a car. In general, other things being equal, we can say that the change in speed of a car is a linear function of the change in the amount of gas. Mathematically, this would be:

∆S = lambda ∆G

where S is speed, G is gas, and lambda is the coefficient relating the two.

But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.

Note that this is NOT a feedback, which would just change the coefficient “lambda” giving the linear relationship between the change in speed ∆S and the change in gas ∆G. The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.
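The difference shows up clearly in a toy simulation. Here is a minimal sketch, with invented numbers throughout and a crude proportional controller standing in for the cruise control:

```python
# Toy cruise-control demo: a governed system decouples output from input.
# All numbers here are invented for illustration only.

def simulate(grades, governed, set_speed=60.0, gain=1.0, ticks=10):
    """Drive a toy car over successive road grades; return the speed history."""
    speed, history = set_speed, []
    for grade in grades:
        for _ in range(ticks):             # each grade persists for several ticks
            throttle = 1.0
            if governed:                    # governor: feed the speed error back
                throttle += gain * (set_speed - speed)
            # Toy dynamics: thrust from the throttle, drag from the hill.
            speed += 0.8 * throttle - 0.8 - 5.0 * grade
            history.append(speed)
    return history

hills = [0.00, 0.02, 0.05, 0.03, 0.00, -0.02, -0.04, 0.00]   # road grades

for governed in (False, True):
    h = simulate(hills, governed)
    print(f"governed={governed}: speed spread = {max(h) - min(h):.2f} mph")
# Ungoverned: the hills move the speed by several mph.
# Governed: the throttle works harder but the speed barely moves.
```

The point is structural: the governor does not change the coefficient between gas and speed; it makes the speed insensitive to the disturbance altogether.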

The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO. And as a result, the change in global temperature is totally decoupled from the changes in forcings. This is why it is so hard to find traces of, e.g., solar and volcanic forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.

To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.

w.

PS—I have also shown that despite their huge complexity, the global temperature output of the models can be emulated to a 98% accuracy by a simple one-line equation. This means that their estimate of the “climate sensitivity” is entirely a function of their choice of forcings … meaning, of course, that even on a good day with a following wind they can tell us nothing about the climate sensitivity.
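For the curious, here is what such a one-line emulation can look like: a generic one-box exponential-lag form with invented parameters, not necessarily the exact equation from the posts referred to above:

```python
import math

# A one-box exponential-lag emulator of the kind described above:
# temperature relaxes toward lambda * F with time constant tau.
# lambda, tau, and the forcing series are all illustrative assumptions,
# not values taken from any particular model run.
LAMBDA, TAU = 0.5, 3.0             # degC per (W/m^2), years

def emulate(forcings, lam=LAMBDA, tau=TAU):
    """One line per step: T relaxes toward lam*F each year."""
    a = math.exp(-1.0 / tau)
    T, out = 0.0, []
    for F in forcings:
        T = a * T + (1.0 - a) * lam * F    # the whole 'model'
        out.append(T)
    return out

# A made-up forcing history: a slow ramp plus a volcano-like dip.
forcing = [0.02 * yr - (2.0 if yr == 60 else 0.0) for yr in range(100)]
temps = emulate(forcing)
print(f"final warming: {temps[-1]:.2f} degC")
```

Note that the emulator's output is driven entirely by the forcing series fed to it, which is the point being made.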

 


317 Comments
MrX
September 15, 2013 11:33 am

Nick Stokes says:
September 15, 2013 at 4:49 am
But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.
————————–
Assuming you’re correct, man-made CO2 has a 5-year half-life. So after 5 years you’d have about 120 Gt of CO2 in the atmosphere, assuming a linear half-life, which it probably isn’t. This amounts to 4% of the total CO2 in the atmosphere as is being reported. But the amount won’t get higher than that even if we continue pumping CO2 into the atmosphere at 30 Gt per year, because of the 5-year half-life. It will stay the same. Rising production by humans won’t make it budge by more than about 1.5 Gt total, forever, per Gt of increase per year.
Yet the total CO2 is still climbing. You would need CO2 with an average half-life of 30 years for this to happen. So the rising CO2 is already known not to be from human sources.
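For anyone wanting to check the arithmetic: below is a minimal simulation under the comment’s stated assumptions (30 Gt/yr emitted, a 5-year half-life treated as simple exponential decay). The exact plateau depends on how the decay is modelled, but a fixed ceiling falls out either way:

```python
# Arithmetic check under the comment's stated assumptions only:
# 30 Gt/yr emitted, 5-year half-life, decay treated as simple exponential.
EMISSIONS = 30.0                    # Gt per year
RETENTION = 2.0 ** (-1.0 / 5.0)     # fraction surviving one year at a 5-yr half-life

stock = 0.0
for year in range(1, 101):
    stock = (stock + EMISSIONS) * RETENTION
    if year in (5, 25, 100):
        print(f"year {year:3d}: human-emitted stock = {stock:6.1f} Gt")
# The stock plateaus near EMISSIONS * r / (1 - r): a fixed ceiling,
# however long the 30 Gt/yr emissions continue.
```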

RC Saumarez
September 15, 2013 11:33 am

At Richard Courtney,
I am well aware of parameterisation in mathematical models. You have an equation which is an abstraction of a physical process. This is parameterised. I have done this many times in a different type of model.
For example, the NCAR Cam 3 model contains, among other things, a thermodynamic model of sea ice. This is based on physical assumptions. (They may not be correct.) This allows the formation of sea ice to be predicted through defined physical mechanisms.
This is a completely different intellectual process to Eschenbach’s difference equation.

Pamela Gray
September 15, 2013 11:38 am

Richard, I have pointed out that AR4 describes them as mathematical. They attempt to dynamically reproduce climate processes using code. They use the term “mathematical” to describe the code. But you describe them as numerical. Why do you think the code is numerical? As I said, there appears to be either a difference in understanding of the code or the terms, or an equivocal use of the two terms. So what do you mean when you use the term “numerical”? What we may be having is a discussion of terms that neither of us has used correctly, or that AR4 has used correctly. I don’t know. You tell me. That is why I asked for a clarification. There appears to be a difference here, and I wanted to know why, and your reasoning.
From what I have read about the ENSO models, the dynamical models certainly use mathematical code strings to represent physics based climate processes and connections as they understand them to be.

Latitude
September 15, 2013 11:40 am

Lars P says:
September 15, 2013 at 11:24 am
========
100%…
They diddled with the past temp record to make it more scary than it was…
….secondary to trying to create an accurate enough temp record from trees, pollen, and ice etc
a 1/2 degree
No wonder all the computer games showed a higher increase in temps than what really happened..
…they can’t even get a flat trend line correct
The computer games will never be, can never be…and are all doomed laughing stock failure
…until they admit they’ve lied about the historical temp record
What are the odds of that happening?

Latitude
September 15, 2013 11:41 am

Assuming you’re correct, man made CO2 has a 5 year half life….
absolute garbage

RC Saumarez
September 15, 2013 11:42 am

At Willis Eschenbach.
Thank you for your rather intemperate reply.
I have in the past been forced to write a post on WUWT to correct the complete rubbish you wrote about signal processing – a subject that you have not studied in any depth.
I am well aware of the nature of a “black box” model, although those of us who are more familiar with the subject might regard it as a set of transfer functions, describing functions or differential equations.
What you clearly fail to recognise is that when you write a differential equation, you are making a statement about the physical structure of the system in question. This brings into question observability and behaviour under a generalised set of variations.
I can see that I will have to write another post to educate you on basic systems analysis.

September 15, 2013 11:43 am

I don’t see governing factors invalidating ∆T = lambda ∆F. Instead, I see extra negative feedbacks reducing the value of lambda, or causing it to decrease as ∆F increases. Even Dr. Roy Spencer, a big-name scientist on the skeptical side, sees validity in ∆T = lambda ∆F. One thing he has done is attempt determinations/estimates of lambda, and he comes up with lower values than the IPCC favors.
One negative feedback that I see increasing as greenhouse gas concentration increases is the lapse rate feedback. A positive feedback that I see decreasing as the world warms is the surface albedo feedback – seasonal snow and ice will retreat to places and times of year where and when there is less sunlight to be affected by further change of snow and ice, or in an extreme case will be largely gone.
I do see how peak tropical ocean surface temperatures are largely regulated by factors reducing the climate sensitivity there, such as big blowups of convection when the ocean gets hot, reaching up to the (greenhouse-gas-cooled) top of the troposphere. But I don’t agree that the system is completely governed. For example, if solar output makes a major long-term change, or the distance between Earth and the sun changes, then peak tropical ocean surface temperature will change.
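The textbook version of that picture, with feedbacks adjusting lambda rather than abolishing the linear relation, is easy to sketch. A minimal example with invented feedback values (the structure, not the numbers, is the point):

```python
# Textbook feedback bookkeeping for lambda (illustrative numbers only).
# Feedbacks are in W/m^2 per degC; positive values amplify warming.
PLANCK = 3.2          # W/m^2 per degC: basic radiative restoring response
F2X = 3.7             # W/m^2: oft-cited forcing for doubled CO2

feedbacks = {
    "water vapour": 1.8,     # invented illustrative values --
    "lapse rate": -0.6,      # the comment argues this one strengthens,
    "surface albedo": 0.3,   # and that this one weakens as ice retreats
}

lam = 1.0 / (PLANCK - sum(feedbacks.values()))   # net sensitivity, degC per (W/m^2)
print(f"lambda = {lam:.2f} degC per W/m^2, ECS ~ {lam * F2X:.1f} degC")
# Making 'lapse rate' more negative or 'surface albedo' smaller lowers
# lambda -- the comment's mechanism for a reduced sensitivity.
```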

Pamela Gray
September 15, 2013 11:56 am

Here is a simple explanation of dynamical mathematical models versus statistical models. Most if not all GCMs are of the dynamical kind, meaning they would be considered mathematical.
http://iri.columbia.edu/climate/ENSO/background/prediction.html#types

RC Saumarez
September 15, 2013 11:57 am

Courtney(2)
I find your comments as regards mathematical models and “numerical models” absolutely extraordinary.
Numerical models are used to solve a set of equations that cannot be solved analytically. If you look at NCAR Cam 3, for example, there are numerous physical assumptions that are cast in a differential form (usually). Since there isn’t a hope of solving these analytically, the equations are approximated numerically.
There is a long discussion about the framework in which the equations describing physical processes such as diffusion, advection in different coordinates, evaporation, etc., are solved numerically to give a stable solution (hopefully) at defined time steps.
Let me give you a simple(r) example. I solve equations governing cardiac propagation. This is theoretically a three-dimensional cable equation. However, the ionic currents are dominated by highly non-linear equations that depend on local potential. I can write the basic 3-D equations for this, but there is no prospect of solving them analytically; the solution has to be obtained through standard numerical techniques.
Note: the model is physical. Diffusion, charge flow and capacitance are assumed from experimental data and basic theory. The ionic current flow equations are semi-empirical, based on observation and some speculation about how they are controlled. The solution is numerical, which is based on a large body of maths about how you approximate differential equations and solve them iteratively.
I think Pamela Gray is dead right and you and Eschenbach are wrong.
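For readers following this exchange, the process being described is standard numerical analysis: physics written as differential equations, then approximated and stepped forward numerically. A minimal generic sketch (a toy 1-D diffusion equation, not taken from CAM3, any GCM, or any cardiac code):

```python
# Toy example of solving a differential equation numerically:
# 1-D diffusion, du/dt = D * d2u/dx2, by explicit finite differences.
# Entirely illustrative -- not drawn from any real model.
D, DX, DT = 1.0, 0.1, 0.004        # DT chosen so D*DT/DX^2 <= 0.5 (stability)
N_CELLS, N_STEPS = 50, 500

u = [0.0] * N_CELLS
u[N_CELLS // 2] = 1.0              # initial condition: one hot cell

for _ in range(N_STEPS):
    new = u[:]
    for i in range(1, N_CELLS - 1):
        # Finite-difference approximation of the second spatial derivative.
        new[i] = u[i] + D * DT / DX**2 * (u[i-1] - 2*u[i] + u[i+1])
    u = new

print(f"peak after diffusion: {max(u):.4f} (started at 1.0)")
```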

September 15, 2013 11:59 am

RC Saumarez:
Thank you for your reply to me at September 15, 2013 at 11:33 am.
I copy it here in full to save others needing to find it.

I am well aware of parameterisation in mathematical models. You have an equation which is an abstraction of a physical process. This is parameterised. I have done this many times in a different type of model.
For, example, The NCAR Cam 3 model contains, among other things, a thermodynamic model of sea ice. This is based on physical assumptions. (They may not be correct). This allows the formation of sea ice to be predicted through defined physical mechanisms.
This is a completely different intellectual process to Eschenbach’s difference equation.

Yes, it IS “a completely different intellectual process to Eschenbach’s difference equation”.
But that was never in dispute.
In your post at September 15, 2013 at 10:24 am you said

The point of a GCM is that it encapsulates mechanisms. If A happens the B will follow that will precipitate C and so on. This means that in such a model parameters that relate to details of climate can be examined for their effects in the future (Note: I am not saying that they do this with any particular accuracy).

I replied at September 15, 2013 at 10:34 am, saying

You think a GCM “encapsulates mechanisms”?
If you really think that then you have been duped. All, yes, ALL the major climate mechanisms are fudged or contain parametrisations in the GCMs.

YOUR RESPONSE IS TO CHANGE THE SUBJECT!
And a parametrisation is a guess. It may be an educated guess, but it is only a guess. You admit this when you say,
“This is based on physical assumptions. (They may not be correct).”
Willis’ model does not include any guesses.
So, as you say, parametrised GCMs use a completely different intellectual process to Eschenbach’s difference equation. The difference is that Eschenbach’s difference equation is not based on guesses.
Richard

September 15, 2013 12:00 pm

Joe Dunfee says:
September 15, 2013 at 9:55 am
The video ad started playing automatically. When an ad starts playing audio without you clicking on it, it is REALLY annoying. Please see if there is any way to stop this from happening.

I haven’t seen an ad, much less a (commercial) video, on this (or almost any other site) for years.
1. Use your system settings to turn off Autoplay.
2. If using a browser that supports add-ons, use AdBlock Plus or similar. I use Firefox, partly for the wealth of add-ons it supports. I also highly recommend its TabMix and Lazarus add-ons. And NoScript. And UnMHT. And Download Helper. And Ghostery.

Pamela Gray
September 15, 2013 12:07 pm

These links provide information on a piece of the code in climate models used to calculate radiation. The first link describes early efforts on a string of code used to calculate the radiation process within a climate model. The second link describes the current efforts to work on this section of code. The code appears to me to be mathematical calculations of a climate dynamic.
http://www.gfdl.noaa.gov/bibliography/related_files/rge9101.pdf
http://onlinelibrary.wiley.com/doi/10.1029/90JD01618/abstract

September 15, 2013 12:07 pm

RC Saumarez:
Your post at September 15, 2013 at 11:57 am begins saying

Courtney(2)
I find your comments as regards mathematical models and “numerical models” absolutely extraordinary.

And I find your posts both offensive and completely ignorant of the subject.
You make vague assertions interspersed with blatant errors. And when those errors are pointed out you change the subject. See my answer to you at September 15, 2013 at 11:59 am.
Post something sensible or choose to clear off.
Richard

Pamela Gray
September 15, 2013 12:09 pm

Hey Joe! The same thing has been happening to me! Twice this morning! It just started happening today on this thread. Weird.

bones
September 15, 2013 12:20 pm

You asked: “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
String theory would have to be at the top of the list. Like CAGW theory, to its adherents it is unfalsifiable.

Pamela Gray
September 15, 2013 12:20 pm

RC Saumarez I must ask, is your work related to the hunt for how to fix a heart without replacing it, and/or how to create a heart that is better at being a normal heart?

September 15, 2013 12:21 pm

Pamela Gray:
I am replying to your ridiculous post at September 15, 2013 at 11:38 am.
1.
You said IPCC AR4 Chapter 8 disagreed with my and Willis’ description of GCMs and set me the homework of reading the entire chapter to try to find that difference.
2.
I said you needed to tell me what the difference is.
3.
Your post I am answering says to me

They use the term “mathematical” to describe the code. But you describe them as numerical. Why do you think the code is numerical?

That’s it? One word? And you wanted me to search the entire chapter to find it!
Pamela Gray, I don’t know what you are doing in this thread, but it does not seem to be constructive.
Firstly, I did NOT talk about the “code”: I did not mention it.
I talked about the models.
The GCMs use finite difference analysis to iterate to a stable solution. That is a numerical model obtaining a numerical solution. The fact that the models are coded with mathematics does not change that.
Richard
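As a generic illustration of “finite difference analysis iterating to a stable solution”, here is a minimal toy example: steady-state heat on a rod, relaxed by Jacobi iteration (nothing here comes from an actual GCM):

```python
# Toy illustration of a finite-difference system iterated to a stable
# solution: steady-state heat on a rod with fixed ends, Jacobi relaxation.
# Purely generic numerics -- not drawn from any climate model.
N = 20
t = [0.0] * N
t[0], t[-1] = 100.0, 0.0            # boundary conditions: hot end, cold end

for sweep in range(10_000):
    new = t[:]
    for i in range(1, N - 1):
        new[i] = 0.5 * (t[i-1] + t[i+1])     # finite-difference average
    change = max(abs(a - b) for a, b in zip(new, t))
    t = new
    if change < 1e-6:                        # 'stable solution' reached
        print(f"converged after {sweep + 1} sweeps")
        break

print(f"midpoint temperature: {t[N // 2]:.2f}")   # ~ linear profile
```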

Pamela Gray
September 15, 2013 12:25 pm

And I must add, you are a busy man/woman! Is your background primarily in math, medical science, or both? I know of research teams who specifically include mathematicians because of the research being done creating and working with models. So I say again, you have been busy!

September 15, 2013 12:33 pm

It’s been my understanding that a “governor” as used in motor vehicles is meant to cap the top speed, not to maintain a consistent, user-defined speed. Maybe it’s just a difference in usage.

September 15, 2013 12:33 pm

“You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5° to 4.5°.”
Actually there is a longer history of the number than that. If you want a real fun lesson in the history of science look at estimates of the speed of light or things like the electron charge.
A good history was started here
http://bartonpaullevenson.com/ClimateSensitivity.html
I would make it a WUWT reference page
One also has to distinguish methods. Most climate scientists don’t see models as the best source of estimates. The best evidence is observational.

papertiger
September 15, 2013 12:43 pm

wws says:
September 15, 2013 at 7:12 am
in response to “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
Oh, I can think of quite a few! UFOlogy, Sasquatchology, Phrenology, the Pernicious Evil of Chemtrails, and Who Was on the Grassy Knoll?
Steven Goddard seems pretty convinced that there was somebody on the Grassy Knoll.
Because there’s no way on Earth an ex-Marine could shoot a politician’s pumpkin-sized head, travelling at parade speed, from 90 yards away, on a clear day, with no wind, given only three shots. /sarcasm tag
cross tagged / pointing out nominal allies’ stupidities
