One Step Forward, Two Steps Back

Guest Post by Willis Eschenbach

David Rose has posted this, from the unreleased IPCC Fifth Assessment Report (AR5):

‘ECS is likely in the range 1.5C to 4.5C… The lower limit of the assessed likely range is thus less than the 2C in the [2007 report], reflecting the evidence from new studies.’ SOURCE

I cracked up when I read that … despite the IPCC’s claim of even greater certainty, it’s a step backwards.

You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5°C to 4.5°C.

However, that was reduced in the Fourth Assessment Report to a narrower, presumably more accurate range of 2°C to 4.5°C. Now, however, they’ve backed away from that, and retreated to their previous estimate.

Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple of orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.

And the estimates of climate sensitivity have not gotten even the slightest bit more accurate.

Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.

So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?

I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.

And what is the wrong path?

The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:

∆T = lambda ∆F

where T is temperature, F is forcing, lambda is the climate sensitivity, and ∆ means “the change in”.
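For concreteness, here is a minimal sketch of that canonical relation in code. The sensitivity value and the 3.7 W/m² doubling forcing below are illustrative round numbers of my own choosing, not figures taken from the IPCC report:

```python
# Minimal sketch of the canonical linear relation delta_T = lambda * delta_F.
# The sensitivity used here is an illustrative round number, not an IPCC figure.

def delta_t(delta_f, lam=0.8):
    """Temperature change (degC) for a forcing change (W/m^2), with constant sensitivity lam."""
    return lam * delta_f

# Roughly 3.7 W/m^2 is the forcing commonly quoted for a doubling of CO2;
# with lam = 0.8 degC per W/m^2 that gives about 3 degC of warming.
print(delta_t(3.7))   # ~2.96
```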

I have shown, in a variety of posts, that the temperature of the earth is not a function of the change in forcings. Instead, the climate is a governed system. As an example of another governed system, consider a car. In general, other things being equal, we can say that the change in speed of a car is a linear function of the change in the amount of gas. Mathematically, this would be:

∆S = lambda ∆G

where S is speed, G is gas, and lambda is the coefficient relating the two.

But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.

Note that this is NOT a feedback, which would just change the coefficient “lambda” giving the linear relationship between the change in speed ∆S and the change in gas ∆G. The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.
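To make the governor analogy concrete, here is a toy simulation with made-up numbers (it is not a model of any real car, and certainly not of the climate). Without the governor, a regression of speed on gas recovers the coefficient lambda; with the governor switched on, gas swings up and down with the load while the speed barely moves, and the fitted slope falls to roughly zero:

```python
# Toy cruise-control demonstration: all numbers are illustrative.
import random

LAM = 2.0        # speed gained per unit of gas, other things equal
SETPOINT = 60.0  # cruise-control set speed

def simulate(governed, steps=500):
    gas, load = SETPOINT / LAM, 0.0
    speeds, gases = [], []
    for _ in range(steps):
        load += random.gauss(0, 0.3)             # slowly varying hills and wind
        load = max(-8.0, min(8.0, load))
        speed = LAM * gas - load
        if governed:
            gas += 0.2 * (SETPOINT - speed)      # governor nudges the throttle
        else:
            gas += random.gauss(0, 0.3)          # driver's foot wanders instead
        speeds.append(speed)
        gases.append(gas)
    return speeds, gases

def slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def stdev(x):
    m = sum(x) / len(x)
    return (sum((a - m) ** 2 for a in x) / len(x)) ** 0.5

random.seed(0)
for governed in (False, True):
    s, g = simulate(governed)
    print("governed" if governed else "free-running",
          "| speed-vs-gas slope:", round(slope(g, s), 2),
          "| speed std dev:", round(stdev(s), 2))
```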

The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO. And as a result, the change in global temperature is totally decoupled from the changes in forcings. This is why it is so hard to find traces of e.g. solar and volcano forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.

To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.

w.

PS—I have also shown that despite their huge complexity, the global temperature output of the models can be emulated to a 98% accuracy by a simple one-line equation. This means that their estimate of the “climate sensitivity” is entirely a function of their choice of forcings … meaning, of course, that even on a good day with a following wind they can tell us nothing about the climate sensitivity.
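For readers who want to see what a “lagged, resized” emulation of that kind can look like, here is a minimal sketch. The exponential-lag form and the parameter values are my own illustrative assumptions, not Willis’s actual published equation or his fitted constants:

```python
# Sketch of a "lagged, resized forcing" emulator: temperature is taken to be
# the forcing series rescaled by a sensitivity and smoothed with an
# exponential lag.  Form and parameters are illustrative assumptions.
import math

def emulate(forcing, lam=0.5, tau=3.0):
    """Return the forcing series rescaled by lam and smoothed with an
    exponential lag of time constant tau (in time steps)."""
    a = math.exp(-1.0 / tau)
    out, t = [], 0.0
    for f in forcing:
        t = a * t + (1.0 - a) * lam * f   # relax toward lam * f each step
        out.append(t)
    return out

# Made-up forcing: a slow ramp with a one-step volcanic dip at step 60.
forcing = [0.02 * i for i in range(100)]
forcing[60] -= 2.0
print([round(x, 2) for x in emulate(forcing)[::20]])
```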

 


317 Comments
Chuck Nolan
September 15, 2013 8:02 am

∆S = lambda ∆G
where S is speed, G is gas, and lambda is the coefficient relating the two.
————————————————
The problem is they do not know the value or the makeup of Lambda.
In the car example everything from wind, tire pressure, altitude, temperature, humidity,
up and down hills, road friction, and wet or icy roads, etc., goes into the variable “Lambda.”
In the case of climate the value and makeup of Lambda are even less known.
Worst of all, Lambda’s value is dynamic, so changes in certain of its components cause changes in some of its other components.
e.g.
car: increased temp causes increased road friction
and
CAGW: increased temp causes increased cloud
And they want to tell me the ∆T to 1/100°.
Note, at least in the car example the system is a little less chaotic, maybe.
cn

September 15, 2013 8:09 am

At last the secrets of environmental computation.
http://en.wikipedia.org/wiki/The_Turk

mbur
September 15, 2013 8:10 am

…sun is the gas tank, water is the engine, and physics is the switch.
http://en.wikipedia.org/wiki/File:Phase_diagram_of_water.svg
Thanks for the interesting articles and comments.

Matthew R Marler
September 15, 2013 8:21 am

Note that this is NOT a feedback,
The governor is not a feedback? The cruise control certainly is a feedback.

Steve Oregon
September 15, 2013 8:22 am

Willis said
“So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?
I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.”
I’d add that the path was chosen and the scientific pursuit is an attempt to prove the path correct without any regard for or amendment to signs that it is not. It’s even worse if detected errors are deliberately ignored.
If I were on a drive north from Portland to Seattle and I passed the California border, Lake Shasta signs, the turnoff to Sacramento, and all sorts of California road signs, I should conclude I’m on the wrong path.
But if my entire career, credibility and agenda were dependent upon my trip to Seattle I’d eventually run out of BS.

Norm
September 15, 2013 8:25 am

Why are the IPCC, Gore, Obama, and Hansen pushing to reduce CO2 if the problem is (supposedly) forcing? Why aren’t they ranking the various forcings and going after those? Their approach is 100% ass-backwards; CO2 isn’t the problem.

September 15, 2013 8:26 am

We haven’t cured cancer or the common cold, we haven’t made a fuel better than gasoline, people still get Alzheimer’s, I still don’t have a flying car; there are many, many fields of human endeavour where little progress has been made in a lot longer than 30 years. Climate “science” proves nothing, predicts nothing that occurs, establishes no new principles, and cannot prove or disprove its central hypothesis that CO2 alters the climate.
Some science, what would we do without them….

Pamela Gray
September 15, 2013 8:33 am

Willis, is your post a model or an equation? A model uses equations to search for the way in which something actually works now, in the past, and in the future. Does your equation function in all these ways and how good is it at hindcasting? Run the model as it currently is stated and tell us what direction you think growing conditions are heading in say 20 years. Run it backwards and tell us that it could have projected the awful freezing temperature in the 50’s and 70’s. Can’t? Will you have to add something to your model to do that or is it ready to go? Would your model have worked to warn farmers of the dustbowl? Would it have worked to encourage preparations for more diversity in farming production during the medieval warm period?
Regarding models, many people also panned efforts to go to the moon, saying it couldn’t be done. One or two scientists trying to find ways to do that privately thought it could not be done. Yet it was done, and they used models to come up with the best chance of being successful. Several of those models probably would not have worked. But the fact that they used models was wrong? And get this: They were using models to predict something that hadn’t happened yet and in which there was a good chance that someone was going to die! Now that was courageous! Were they foolish?
I think it is possible to eventually get these models right but it will take a long time. Why? This is a “duh” moment: We have to compare the model to the rather noisy observation and it takes a while to see whether or not there is a match. You are complaining about something that could not have been avoided. That’s right. They could not have avoided this length of time. Say What? You wanted them to have been correct within the first 5 years? How could that have been possible?
You must be part of the now generation. Here is a model of someone who wants what they want now and if they don’t get it, someone is wrong. You want someone to project what you want for breakfast. You want them to be right the first time. And you want it served at 6:00 AM sharp. If it doesn’t happen, the breakfast provider is wrong and they should lose their job. hmmm. Maybe the world should have fired Edison early in his career. And all the people that came before and after him regarding several model and subsequent invention improvements we now enjoy today. They took too long and wasted too much money.
But in reality, Science just seems to work this way. It takes too long and wastes too much money before they get it right. Yes it makes us mad and we want to throw the bums out. Good thing we weren’t successful in the past.

Bill Illis
September 15, 2013 8:39 am

The average of UAH and RSS is now lower than even the lowest climate model predictions from IPCC AR4. The lowest models have an in-built assumption of 2.2°C per doubling (versus 3.0°C in the average), and the lower troposphere is supposed to warm at 1.3 times the surface. So that should tell you something.
http://s24.postimg.org/uk2g52uol/IPCC_AR4_vs_HCrt4_UAH_RSS_Aug_2013.png

mrmethane
September 15, 2013 8:41 am

Feedback? Cruise control USES feedback to maintain speed in response to increases and decreases in measured speed about a SET-POINT. The nature of the feedback LOOP may or may not include delays to effect smoothing, hysteresis (“dead-band”), ramping, anticipation and other elements to enhance effectiveness in concert with comfort. Feedback is what happens INSIDE the control system, which has inputs and outputs, the outputs being RESULTS of the feedback “transfer functions” inside. A human driver adjusts the vehicle throttle (output) as a result of becoming aware of a speed “error signal”, with the feedback being part of the mental process used to decide on a throttle adjustment. May be a subtle point but important. In a simple cloud system, internal POSITIVE feedback would result in reduced cloud cover in response to rising temp, while NEG would effect an INCREASE in cloud cover when the temps rise.
Feedback is part of the “transfer function”.
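A minimal sketch of the distinction this comment is drawing: a feedback inside the transfer function rescales the open-loop sensitivity (the textbook closed-loop relation lambda = lambda0 / (1 - f)) rather than severing the link between input and output. The sensitivity and the feedback factors below are illustrative, not measured values:

```python
# Closed-loop gain for a system with feedback factor f (textbook relation).
# lam0 and the f values are illustrative numbers, not climate measurements.

def effective_sensitivity(lam0, f):
    """Closed-loop gain for open-loop gain lam0 and feedback factor f (f < 1)."""
    return lam0 / (1.0 - f)

lam0 = 0.3  # illustrative open-loop sensitivity
for f, label in ((-0.5, "negative feedback"), (0.0, "no feedback"), (0.5, "positive feedback")):
    print(f"{label:>17} (f = {f:+.1f}): lambda = {effective_sensitivity(lam0, f):.2f}")
```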

Scott Basinger
September 15, 2013 9:00 am

Pamela Gray writes: “But in reality, Science just seems to work this way. It takes too long and wastes too much money before they get it right. Yes it makes us mad and we want to throw the bums out. Good thing we weren’t successful in the past.”
The fact that there are refinements and discoveries yet to be made in order to have models which are less wrong doesn’t bother me all that much. What does bother me is that many of the scientists involved are playing a game. The game is ‘sell fear’, and many who should honestly know better have beat the drum of consensus in order to keep the money coming.
I suspect that many in the field who have personally beat this drum are painfully aware that their models have been terrible at prediction and that many physical processes that they have assumed to be a certain way may in fact be the opposite. Those voices they’ve tried to squelch and badmouth (even as they had to redefine what the peer-reviewed literature was), such as Roy Spencer, may have been right all along about water vapour feedback. I bet it irks them that someone they have tried to paint as a creationist religious zealot turns out to be a more adept seeker of truth than those of the True Consensus Faith.
As for Willis, the criticism over your equation is valid. You would do well to admit it lest you appear as inflexible and incapable of change as your oft unworthy opponents.

September 15, 2013 9:23 am

richardscourtney says: September 15, 2013 at 7:17 am

So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.
Excellent comment, thank you Richard.
Parties interested in the fabrication of aerosol data to force-hindcast climate models (forcing the models to fit the cooling from ~1940 to ~1975, in order to compensate for the models’ highly excessive estimates of ECS (sensitivity)) may find this 2006 conversation with D.V. Hoyt of interest:
http://www.climateaudit.org/?p=755
Douglas Hoyt, responding to Allan MacRae:
“July 22nd, 2006 at 5:37 am
Measurements of aerosols did not begin in the 1970s. There were measurements before then, but not so well organized. However, there were a number of pyrheliometric measurements made and it is possible to extract aerosol information from them by the method described in:
Hoyt, D. V., 1979. The apparent atmospheric transmission using the pyrheliometric ratioing techniques. Appl. Optics, 18, 2530-2531.
The pyrheliometric ratioing technique is very insensitive to any changes in calibration of the instruments and very sensitive to aerosol changes.
Here are three papers using the technique:
Hoyt, D. V. and C. Frohlich, 1983. Atmospheric transmission at Davos, Switzerland, 1909-1979. Climatic Change, 5, 61-72.
Hoyt, D. V., C. P. Turner, and R. D. Evans, 1980. Trends in atmospheric transmission at three locations in the United States from 1940 to 1977. Mon. Wea. Rev., 108, 1430-1439.
Hoyt, D. V., 1979. Pyrheliometric and circumsolar sky radiation measurements by the Smithsonian Astrophysical Observatory from 1923 to 1954. Tellus, 31, 217-229.
In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly. There are other studies from Belgium, Ireland, and Hawaii that reach the same conclusions. It is significant that Davos shows no trend whereas the IPCC models show it in the area where the greatest changes in aerosols were occurring.
There are earlier aerosol studies by Hand and in other in Monthly Weather Review going back to the 1880s and these studies also show no trends.
So when MacRae (#321) says: “I suspect that both the climate computer models and the input assumptions are not only inadequate, but in some cases key data is completely fabricated – for example, the alleged aerosol data that forces models to show cooling from ~1940 to ~1975. Isn’t it true that there was little or no quality aerosol data collected during 1940-1975, and the modelers simply invented data to force their models to history-match; then they claimed that their models actually reproduced past climate change quite well; and then they claimed they could therefore understand climate systems well enough to confidently predict future catastrophic warming?”, he is close to the truth.”
_____________________________________________________________________
Douglas Hoyt:
July 22nd, 2006 at 10:37 am
MacRae:
Re #328 “Are you the same D.V. Hoyt who wrote the three referenced papers?”
Hoyt: Yes
.
MacRae: “Can you please briefly describe the pyrheliometric technique, and how the historic data samples are obtained?”
Hoyt:
“The technique uses pyrheliometers to look at the sun on clear days. Measurements are made at air mass 5, 4, 3, and 2. The ratios 4/5, 3/4, and 2/3 are found and averaged. The number gives a relative measure of atmospheric transmission and is insensitive to water vapor amount, ozone, solar extraterrestrial irradiance changes, etc. It is also insensitive to any changes in the calibration of the instruments. The ratioing minimizes the spurious responses leaving only the responses to aerosols.
I have data for about 30 locations worldwide going back to the turn of the century.
Preliminary analysis shows no trend anywhere, except maybe Japan.
There is no funding to do complete checks.”
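A minimal sketch of the ratioing idea Hoyt describes, using a simple Beer-Lambert model I(m) = cal * I0 * exp(-tau * m) for the direct-beam irradiance at air mass m (the optical depths and calibration factors below are made-up numbers for illustration). Each ratio I(4)/I(5), I(3)/I(4), I(2)/I(3) reduces to exp(tau), so the averaged ratio responds to aerosol changes while any calibration factor cancels:

```python
# Sketch of the pyrheliometric ratioing technique under a Beer-Lambert
# assumption.  tau and cal values are illustrative, not real measurements.
import math

def ratio_index(tau, cal=1.0, i0=1361.0):
    """Average of the air-mass ratios 4/5, 3/4, 2/3 for a simple
    Beer-Lambert atmosphere with total optical depth tau."""
    I = {m: cal * i0 * math.exp(-tau * m) for m in (2, 3, 4, 5)}
    ratios = [I[4] / I[5], I[3] / I[4], I[2] / I[3]]
    return sum(ratios) / len(ratios)

print(round(ratio_index(tau=0.10, cal=1.00), 4))  # ~1.1052
print(round(ratio_index(tau=0.10, cal=0.90), 4))  # identical: calibration cancels
print(round(ratio_index(tau=0.15, cal=1.00), 4))  # ~1.1618: the aerosol change shows up
```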

John Norris
September 15, 2013 9:25 am

re: “… after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest.”
I’m betting we passed “millions of dollars” somewhere back in the 80’s.
Another item where the count has gone up is the number of models and modelers. You’re not a global team contributor if your country’s scientists don’t have a model input for the IPCC. Of course this further supports the theme of your post. Despite the increased capacity and effort, there is no apparent improvement in our ability to predict the future. Their simulations all overshot reality.

September 15, 2013 9:27 am

Friends:
I write in hope of providing some clarity to the issues of modelling introduced to this thread by Pamela Gray at September 15, 2013 at 8:33 am and discussed by Willis Eschenbach in a series of subsequent posts.
First, I state some basic modelling principles.
A model is a simplified representation of reality.
Being simplified, no model is an exact emulation of reality; i.e. no model is perfect and no model is intended to be perfect.
A model is constructed for a purpose.
For example, a model of heat loss from a cow may assume that a cow is shaped as a sphere with the surface area of a real cow. And this simple model may provide an adequate quantitative indication of how heat loss from a cow varies with the cow’s metabolic rate. Thus, this hypothetical model may be very useful.
But that model of a cow cannot be used to indicate the movements of a cow. A model of a cow which includes legs is needed for that.
Another model of a cow may be constructed purely for the pleasure of the modeller. In this case it may be carved from wood and painted.
Possible purposes for models are infinite.
A model may have many forms.
It may be physical, abstract, algebraic, numeric, pictorial or an idea. If its form fulfils the desired usefulness then it is an appropriate model; i.e. it can fulfil its purpose.
In the context of the discussion between Pamela and Willis, there are two questions of importance because these questions are ALWAYS important when considering any model.
Question 1. What is the purpose of the model?
Willis explains this when he says

My model is a very simple model, which emulates the global temperature results of the climate models with extremely high fidelity.
{snip}
But then, what would you expect from a model (either mine or a GCM) which merely outputs a lagged, resized version of the inputs?

In other words, the purpose of his model is to determine the form of GCM outputs.
Question 2. Does the model fulfil its intended purpose?
Again, Willis explains this saying

My model is a very simple model, which emulates the global temperature results of the climate models with extremely high fidelity. Regarding the global temperature, it can do everything that the climate models can do … which, as my model shows, is nothing.
But then, what would you expect from a model (either mine or a GCM) which merely outputs a lagged, resized version of the inputs?

In other words, his model has fulfilled its purpose by determining the form of GCM outputs is “lagged, resized version of the inputs” and, thus, has demonstrated the GCMs output “nothing” of value.
Summary
Pamela seems to consider all models as being numerical models which are required to hindcast and to forecast. Of course, GCMs are numerical models with the purpose of hindcasting and forecasting climate. But Willis’ model is mathematical (n.b. not numerical) and was constructed to assess whether GCMs fulfil their purpose, and it shows they don’t.
Richard

Pamela Gray
September 15, 2013 9:29 am

Willis, I simply used a vignette to reveal a type of response similar to those portrayed in comments and the tone of your post. Vignettes are useful in that way. Which is, they got it wrong, throw the bums out. And there is history, also a useful endeavor, that helps us examine the process science has taken when they got it wrong, and wrong, and wrong again. Which together tells us that there are other possibly more fruitful ways to respond to the current state of climate science modeling. Modeling will not go away. So maybe we need to focus on reasoned and plausible ways to improve the modeling.

David Riser
September 15, 2013 9:30 am

Thanks Willis,
Genius at work again. I would add that the governor works better than most folks will admit. The error bars on the historical temp record are such that there hasn’t been any provable significant long term warming over the entire record. There is some legitimate concern that the temp record has been fiddled with over the course of the last 30 or so years to make a case for warming. It is surprising that there is such consistency across a temp record that is created by taking the median value of monthly median values of annual median values and creating a trend through statistics.
As for physics not making any progress over 30 years… There has been a steady improvement in understanding of small processes, which has filtered steadily into the practical application of those processes. So for instance, every time someone says we cannot do X, we end up a few years later able to do it. One area this shows through is electronics: things continue to get smaller even after we thought that capacitance-caused resistance would stop the shrink. Our understanding of these processes has improved.
As for no flying cars… that is not because we can’t make a car that flies; it’s because a flying car is a regulatory nightmare. A flying car has been a reality for a very long time; jet engines are small enough now to fit in a motorcycle, which gives more than enough power to put a car in the air. Once you get more than a single car up, though, you have the possibility of a pretty horrendous accident.
v/r,
David Riser

David Riser
September 15, 2013 9:36 am

Richard,
One thing to add about General Circulation Models (GCMs): they were not originally designed to model climate. They are weather forecasting tools; they do this well over a period of about 48 hours and not so well over 3 to 10 days. So the idea that they will accurately predict climate is absurd. In order to do that they would need to be many orders of magnitude larger, i.e. have smaller grid sizes, and they would need a better understanding of atmospheric physics.
v/r,
David Riser

Bloke down the pub
September 15, 2013 9:42 am

Willis Eschenbach says:
September 15, 2013 at 6:14 am
Bloke down the pub says:
September 15, 2013 at 4:19 am
But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.
~~~~~
I don’t think this is a suitable example Willis, as the cruise control controls the speed by controlling the flow of petrol. Admittedly an automatic gearbox would also play a part but that’s a separate issue.
~~~~~
To the contrary, it is an excellent example, because the clouds control the incoming energy based on the temperature, just as the cruise control regulates the incoming gas based on the speed.
++++++++++++++
Willis, I agree with your last statement but that doesn’t seem to tally with the part of your original post that I’ve highlighted.

September 15, 2013 9:43 am

Friends:
I write to draw attention to the post of Allan MacRae at September 15, 2013 at 9:23 am.
It is VERY important, especially its concluding paragraph.
When you know that each GCM is fudged by an assumed aerosol forcing whose value is completely arbitrary and unique to that GCM, then you know WHY Willis’ model of GCM output shows the models are worthless.
Richard

Pamela Gray
September 15, 2013 9:47 am

Richard, I do not consider all models to be numerical. I have been a follower of statistical and dynamical ENSO models for quite some time. I am arm-chair hobby familiar with both numerical (input historical data) and dynamical (use mathematical equations to simulate dynamically the processes we think are happening) ENSO models. I am also arm-chair hobby familiar with climate models and how various scientists use them by driving them with various tunable and data inputs. Right now I am attempting to fine-tune your model of me by creating a hybrid suggestion. It oughta work better.