One Step Forward, Two Steps Back

Guest Post by Willis Eschenbach

David Rose has posted this, from the unreleased IPCC Fifth Assessment Report (AR5):

‘ECS is likely in the range 1.5C to 4.5C… The lower limit of the assessed likely range is thus less than the 2C in the [2007 report], reflecting the evidence from new studies.’ SOURCE

I cracked up when I read that … despite the IPCC’s claim of even greater certainty, it’s a step backwards.


You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5°C to 4.5°C.

However, that was reduced in the Fourth Assessment Report to a narrower, presumably more accurate range of 2°C to 4.5°C. Now, however, they’ve backed away from that, and retreated to their previous estimate.

Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.

And the estimates of climate sensitivity have not gotten even the slightest bit more accurate.

Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.

So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?

I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.

And what is the wrong path?

The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:

ΔT = lambda ΔF

where T is temperature, F is forcing, lambda is the climate sensitivity, and Δ means “the change in”.
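For concreteness, here is how that canonical relation is typically applied, in a short sketch. The 5.35·ln(C/C₀) forcing expression is the commonly cited simplified formula for CO2, and the lambda values shown are just the IPCC’s range converted into per-W/m² form; nothing here is from any particular model.

```python
import math

def co2_forcing(c, c0):
    """Commonly cited simplified CO2 forcing in W/m^2: F = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c / c0)

def delta_t(delta_f, lam):
    """The canonical linear relation criticised above: dT = lambda * dF."""
    return lam * delta_f

f2x = co2_forcing(560, 280)          # forcing from a CO2 doubling, ~3.7 W/m^2
for ecs in (1.5, 4.5):               # the IPCC 'likely' range, deg C per doubling
    lam = ecs / f2x                  # implied sensitivity parameter, K per (W/m^2)
    print(f"ECS {ecs} C  ->  lambda {lam:.2f} K/(W/m^2)")
```

Note that once lambda is fixed, the temperature response is fully determined by the forcing, which is exactly the assumption the rest of this post disputes.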

I have shown, in a variety of posts, that the temperature of the earth is not a function of the change in forcings. Instead, the climate is a governed system. As an example of another governed system, consider a car. In general, other things being equal, we can say that the change in speed of a car is a linear function of the change in the amount of gas. Mathematically, this would be:

ΔS = lambda ΔG

where S is speed, G is gas, and lambda is the coefficient relating the two.

But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.

Note that this is NOT a feedback, which would just change the coefficient “lambda” giving the linear relationship between the change in speed ∆S and the change in gas ∆G. The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.
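The decoupling described above can be illustrated with a toy simulation. All the numbers here (gains, drag coefficient, load noise) are hypothetical and chosen only to show the effect: without the governor, speed wanders with throttle and road load; with the governor, throttle still varies but speed barely moves.

```python
import random

def drive(governed, steps=500, seed=1):
    """Toy car: speed responds to throttle minus drag and a varying road load.
    With the governor on, throttle is set each step to hold a target speed."""
    rng = random.Random(seed)
    speed, target = 60.0, 60.0
    throttles, speeds = [], []
    for _ in range(steps):
        load = rng.uniform(-5, 5)                     # hills, wind, road surface
        if governed:
            throttle = 30.0 + 5.0 * (target - speed)  # the 'cruise control'
        else:
            throttle = 30.0 + rng.uniform(-5, 5)      # driver's foot wanders
        speed += 0.1 * (throttle - 0.5 * speed) - 0.1 * load
        throttles.append(throttle)
        speeds.append(speed)
    return throttles, speeds

for governed in (False, True):
    gas, spd = drive(governed)
    print(f"governed={governed}: speed range {max(spd) - min(spd):.1f} mph, "
          f"throttle range {max(gas) - min(gas):.1f}")
```

In both runs the throttle (gas consumption) varies over a similar range, but only in the ungoverned run does that variation show up in the speed, which is the distinction between a feedback and a governor.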

The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO. And as a result, the change in global temperature is totally decoupled from the changes in forcings. This is why it is so hard to find traces of e.g. solar and volcano forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.

To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.

w.

PS—I have also shown that despite their huge complexity, the global temperature output of the models can be emulated to a 98% accuracy by a simple one-line equation. This means that their estimate of the “climate sensitivity” is entirely a function of their choice of forcings … meaning, of course, that even on a good day with a following wind they can tell us nothing about the climate sensitivity.
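As an illustration of how such a one-line emulation works (this is a sketch with synthetic data, not the actual fit described above): if the “model output” is itself just a lagged linear response to the forcing, then a single fitted lambda reproduces it almost exactly, and that lambda depends only on the forcing series chosen.

```python
import math, random

# Synthetic stand-in for GCM output: a lagged linear response to a smoothly
# ramping CO2 forcing, plus a little noise. All parameters are hypothetical.
rng = random.Random(0)
forcing = [5.35 * math.log((280 + 1.5 * t) / 280) for t in range(120)]
lam_true, tau, temp, out = 0.8, 4.0, 0.0, []
for f in forcing:
    temp += (lam_true * f - temp) / tau      # one-line lagged response
    out.append(temp + rng.gauss(0, 0.02))

# Fit the one-parameter emulator dT = lambda * dF by least squares (no intercept)
lam_fit = sum(f * t for f, t in zip(forcing, out)) / sum(f * f for f in forcing)
ss_res = sum((t - lam_fit * f) ** 2 for f, t in zip(forcing, out))
mean_out = sum(out) / len(out)
ss_tot = sum((t - mean_out) ** 2 for t in out)
r2 = 1 - ss_res / ss_tot
print(f"fitted lambda = {lam_fit:.2f}, R^2 = {r2:.3f}")
```

The near-perfect fit is the point: when output tracks forcing this closely, the “sensitivity” you recover is an input choice, not a discovery.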

317 Comments
CodeTech
September 15, 2013 4:08 am

Which is a wordy way of saying,
“They’re Wrong”
or, my favorite mantra,
CO2 DOES NOT DRIVE CLIMATE.
Honestly, love the way they went back to the original estimate. Even that is ridiculous. Clearly 4.5 is not even in the realm of possible, and 1.5 should probably be the high side. If only climate “science” would learn the meaning of “error bars”.

Nick
September 15, 2013 4:10 am

Why is El-Nino a forcing?
It’s just one of many outputs.

johnmarshall
September 15, 2013 4:16 am

They are so convinced that CO2 is the problem that all else is excluded, including the fact that CO2 cannot be the stated problem. Also the K&T (AR4) graphic is so wrong (flat earth, 24/7 sunlight, etc.) that using it to evaluate energy flow takes completely the wrong turn into stupidity and theories that cannot work.

RoyFOMR
September 15, 2013 4:17 am

Your cruise control analogy is excellent.

Bloke down the pub
September 15, 2013 4:19 am

But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.
I don’t think this is a suitable example Willis, as the cruise control controls the speed by controlling the flow of petrol. Admittedly an automatic gearbox would also play a part but that’s a separate issue.

Bob Young
September 15, 2013 4:20 am

I seem to recall James Lovelock making the same point in his book ‘Gaia’, that the Earth’s climate is governed by a control system. This is a concept that is second nature to any engineer. A well designed process control system (using proportional, integral and derivative drivers) will tamp down any process upsets and adjust the process to maintain a consistent output from the process. When I read the book in the late 80’s/early 90’s, that one point was so obvious to me. In the absence of a massive process upset (asteroid strike, CME), the Earth will regulate itself with its own control system to maintain a consistent temperature.
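The process-control point here can be sketched in a toy loop. The plant model, gains, and disturbance below are all hypothetical, chosen only to show a textbook PID controller absorbing a “process upset” and returning the output to its setpoint.

```python
def pid_step(error, state, kp=2.0, ki=0.5, kd=1.0, dt=1.0):
    """One step of the textbook PID law:
    output = Kp*e + Ki*integral(e) + Kd*de/dt."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

# Toy process: the value decays and is hit by a sudden disturbance at step 50,
# but the controller pushes back and restores the setpoint.
value, target, state = 20.0, 20.0, (0.0, 0.0)
for step in range(200):
    disturbance = 5.0 if step >= 50 else 0.0   # a sudden 'process upset'
    control, state = pid_step(target - value, state)
    value += 0.1 * (disturbance + control - 0.2 * value)
print(f"final value: {value:.2f} (target {target})")
```

The integral term is what drives the steady-state error to zero despite the constant disturbance, which is the sense in which a well tuned control system “tamps down” upsets rather than merely damping them.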

September 15, 2013 4:24 am

Thanks Willis, well said..

Stacey
September 15, 2013 4:27 am

The most plausible explanation could be hydrogen sulphide?
As in Bad Egg Climate Scientists.

September 15, 2013 4:27 am

“The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:
Δ T = lambda ΔF
where T is temperature, F is forcing, lambda is the climate sensitivity, and Δ means “the change in”.”

You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?
There’s a proposition that if CO2 is doubled, then one can try to quantify the eventual rise in temperature. That’s very different from your “canonical” equation. And the range that you ridicule is partly a recognition that the relation is indeed complex.

Stacey
September 15, 2013 4:32 am

Sorry bloke down the pub, but when on cruise control the amount of petrol used will change due to driving conditions?
Wind resistance or assistance, driving up or down hills. Also the road surface?

Stacey
September 15, 2013 4:38 am

Mr Stokes
Please can you help me by either explaining or pointing me in the direction of a paper that demonstrates the amount of industrialisation that would be required to double atmospheric CO2 levels. Along with the timescale?

September 15, 2013 4:41 am

Time to drag out my alternate theory of climate change.
The climate changes when factors affecting the phase changes of water (ignoring Milankovitch cycles) change. Specifically, aerosols, black carbon, organic carbon, GCRs, and perhaps a couple of other things drive climate change over decadal to century scales.
This, BTW, is consistent with Willis’s Governor System, as the factors referred to affect the (water phase change) governors.

Julian in Wales
September 15, 2013 4:46 am

“This is why it is so hard to find traces of e.g. solar and volcano forcings in the temperature record. We know that both of those change the forcings … but the temperatures do not change correspondingly.”
Does that mean that you are optimistic that the present state of the sun will not produce another cold period like the ones we experienced during the Maunder and Dalton minimums? I fear another cooling period far more than warming, and I would like to be optimistic that one will not re-occur in my lifetime. The thought of mass starvation and food shortages during my old age is something I do not want to see.

September 15, 2013 4:47 am

For every complex problem there is an answer that is clear, simple, and wrong. — H.L. Mencken

lemiere jacques
September 15, 2013 4:48 am

Well, computer speed or memory doesn’t help much; they are still unable to give an error bar. As a result you just don’t know if a “newer” simulation with a much greater number of cells and so on is closer to reality than an “older” one. Newer and more complex may be sexier, but you just don’t know if it is more “real”.
It may be possible that the error bar could be estimated, but I guess that the complexity of estimating the error bar is an order of magnitude greater than the complexity of the simulation itself. But that is only a guess.

Nick Stokes
September 15, 2013 4:49 am

Stacey says: September 15, 2013 at 4:38 am
“Mr Stokes
Please can you help me by either explaining or pointing me in the direction of a paper that demonstrates the amount of industrialisation that would be required to double atmospheric CO2 levels. Along with the timescale?”

That’s hardly relevant to the provenance of this “canonical” equation. But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.

geronimo
September 15, 2013 5:01 am

Willis, the 1.5-4.5C sensitivity first appeared in the Charney report in 1979. It was, as I recollect, taken from two computer forecasts: one gave a 2C sensitivity and the other, from none other than Jim Hansen, gave a 4C sensitivity. In a moment of scientific genius Charney decided that there was an uncertainty of 0.5C on each and declared the sensitivity to be 3C +/- 1.5C. There it stayed until AR4, when an effort to strike more fear into the public was made by raising the lower bound to 2C, where it probably would have stayed if we hadn’t recently had a plethora of papers setting 2C as the higher level of sensitivity. You’re right; I’ve often wondered how a climate sensitivity plucked out of thin air has stayed static for 34 years, particularly, as you point out, with the massive increase in computing power available to the modellers.
Awaiting your next report from your holiday in the UK with bated breath.

Bloke down the pub
September 15, 2013 5:10 am

Stacey says:
September 15, 2013 at 4:32 am
Sorry bloke down the pub but when on cruise control the amount of petrol used will change due to driving conditions?
Wind resistance or assistance, driving up or down hills. Also the road surface?
Yes, just as it does when not on cruise control.

pesadia
September 15, 2013 5:19 am

“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t”
Not much has happened in physics, unless you believe that the god particle has been discovered,
which I don’t

AlecMM
September 15, 2013 5:20 am

The claim of a 33 K GHE, and hence triple climate sensitivity, was a mistake in 1981_Hansen_etal.pdf**
This means the positive feedback claim is wrong and CS is no more than 1.2 K.
However, correct all the other mistakes and do the real science job, and it’s <0.1 K.
**They implied that by taking out GHGs, the surface average temperature would fall to -18°C, so the difference from the present is the GHE. However, no clouds or ice increases SW heating by 43%, giving a new surface temperature average of ~4-5 deg C, a GHE of ~11 K.
I expect Hansen to apologise for misleading the World, and the names of the referees who failed to check this bad science should be made public by the ‘Science’ journal.

KevinM
September 15, 2013 5:23 am

“Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.”
One would expect the same result if they were right the first time. A million variable monte carlo simulation of the apple and the tree would not have much effect on gravity.

Richard M
September 15, 2013 5:26 am

Even GHGs work as governors in the atmosphere. The entire view that they work like a greenhouse is wrong. They work more like a climate control system. If the planet warms they increase the flow of radiation to space, and if the planet cools this flow is reduced (the basic GHE takes over).

papertiger
September 15, 2013 5:27 am

That’s hardly relevant to the provenance of this “canonical” equation. But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.
About 3000 gigatons? So you don’t really know how much CO2 is naturally in the atmosphere to less than four decimal places, even when talking gigatons.
Bet you anything you care to name that the “over 30 gigatons a year” is only a guess as well.

Tom in Florida
September 15, 2013 5:29 am

Nick Stokes says:
September 15, 2013 at 4:49 am
“That’s hardly relevant to the provenance of this “canonical” equation. But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.”
What is the estimated gigatons of the entire atmosphere?

AlecMM
September 15, 2013 5:30 am

The climate models are based on 13 mistakes in the physics, 3 of which are elementary.
I can forgive the Meteorologists because along with Climate Alchemists they are taught incorrect radiation physics.
However, for any other engineer or physicist, taught correct physics, to agree that pyrgeometers output a real energy flux is unprofessional.
This mistake triples heat input but they knock it back by incorrectly claiming Kirchhoff’s Law of Radiation applies at TOA and then use double real low level cloud optical depth so as to come up with ‘agw’ that might be true if they hadn’t got the IR physics wrong.
[Will Happer warned of that 20 years ago when he refused to lie for Gore.]
The real agw has been Asian aerosols reducing cloud albedo. That has not saturated. The problem here is that Sagan’s aerosol optical physics is wrong and the sign of the effect is reversed and it’s from large droplets, easy to prove by experiment.
All in all this has been a disaster for science.

Rob Dawg
September 15, 2013 5:34 am

“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone?”
Astrology and economics are the best comparisons. From the bumbling of astrology using math and science we eventually got astronomy. My feeling is economics is getting closer to that transition as well. We are getting better through observation and computation at modeling economic behavior. The what. From that they may eventually get the why, the forces that rule the results. It is no coincidence that climatology aligns with economics, and why both bristle at the frequent comparisons to astrology.

AlecMM
September 15, 2013 5:42 am

Climate Alchemy is not a science because the models, based on wildly incorrect physics are the same as the old alchemists repeating an experiment time after time in the faint hope that they will find the philosopher’s stone.
It’s lunacy on a gargantuan basis and corrupt politicians are paying them our money to do it.

Scott
September 15, 2013 5:46 am

This is a great analogy, another analogy I’ve been thinking of that might explain the disagreements between the two camps on AGW might be the static versus dynamic view of the revenues to government when a new tax on society is levied. A tax is also a forcing, a forcing on society. The static view on taxation (generally held by liberals) is that when the new tax is enacted people generally behave the same and just pay the tax, and all else being equal, the additional tax revenues to government are easily predicted. The dynamic view of taxation is that a tax changes people’s behavior, sometimes dramatically so, and sometimes when a new tax is enacted the result is even LESS revenues to government. Those subscribing to the static view of taxation cover their ears and never, ever want to consider this possibility, it is basically impossible in their mind, taxes go up, revenues go up, and that’s that.
Perhaps the minds of many are structured to force them to think in terms of simple cause and effect rather than more dynamically, and their views on taxation infect other endeavors such as computer modeling of the climate.

Nick Stokes
September 15, 2013 5:51 am

Tom in Florida says: September 15, 2013 at 5:29 am
“What is the estimated gigatons of the entire atmosphere?”

About 5,100,000. Basically, surface atm pressure * surface area / g. Varies a bit with humidity.
papertiger says: September 15, 2013 at 5:27 am
“About 3000 gigatons? So you don’t really know how much co2 is naturally in the atmosphere to less than four decimal places, even when talking gigatons.
Bet you anything you care to name that the “over 30 gigatons a year” is only a guess as well.”

The CO2 arithmetic is simple – 5,100,000 * 400 ppmv, and then a molecular weight calc. Emissions were here; 32.578645 GTons in 2011, if you like a lot of decimals. Burning C costs money, so these numbers come from accountants, not scientists.
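The back-of-envelope arithmetic can be checked against standard reference values (total atmospheric mass about 5.1×10^18 kg, mean molar mass of dry air about 28.97 g/mol, CO2 44.01 g/mol); the round figure of ~3000 Gt of CO2 holds up:

```python
# Checking the CO2 mass arithmetic from standard reference values.
atmosphere_gt = 5.1e18 / 1e12        # atmosphere mass in gigatonnes (1 Gt = 1e12 kg)
co2_ppmv = 400e-6                    # volume (i.e. molar) fraction of CO2
co2_gt = atmosphere_gt * co2_ppmv * (44.01 / 28.97)  # molar -> mass conversion
print(f"CO2 in atmosphere: ~{co2_gt:,.0f} Gt")
```

The molar-mass ratio is needed because ppmv is a count of molecules, not a mass fraction, so the 400 ppmv figure must be scaled up by 44/29 to get gigatonnes.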

Robin Kool
September 15, 2013 5:57 am

Hi Willis. I like your post as usual. Well thought through, and well said – as usual.
Your theory of emergent weather phenomena as governors of the Earth’s temperature is stimulating and merits rigorous testing.
There need to be peer reviewed articles to present it to the scientific world.
I imagine it will be a hard fight, but that seems to me the only way it can find its way into the great debate that is raging in the world.
Your question: “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
I’d say, String Theory.
It has been going on for decades now and seems to be no nearer to testable predictions. So in Karl Popper’s science philosophy, it is still pre-scientific.

Nick Stokes
September 15, 2013 5:58 am

“At that time, the range was said to be from 1.5° to 4.5°.”
That was the Charney report, 1979. They said of their estimate:
“These are at best informed guesses, but they do enable us to give rough estimates to the probable bounds for the global warming.”
They guessed well.

DaveS
September 15, 2013 6:01 am

“And the range that you ridicule is partly a recognition that the relation is indeed complex.”
So you agree with Judith Curry, presumably, that the IPCC’s claim that certainty has increased from 90% to 95% is nonsense?
As a taxpayer, I have to say that, given the amount of public money ‘climate scientists’ have burned up since 1979, your collective failure to improve this range doesn’t look good. Maybe you have been barking up the wrong tree all this time.

TimTheToolMan
September 15, 2013 6:02 am

Nick notes “You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?”
Willis has shown that the model output is of the simple form ΔT = lambda ΔF with an unbelievably high correlation (0.97 or so, if my memory serves). You’ve seen those posts, Nick. Do you have anything to add to his results? Or are you merely fishing for an appeal to authority (or lack thereof)?

September 15, 2013 6:03 am

KevinM:
At September 15, 2013 at 5:23 am you quote Willis having written

Now consider: the first estimate was done in 1980, using a simple computer and a simple model. Since then, there has been a huge, almost unimaginable increase in computer power. There has been a correspondingly huge increase in computer speed. The number of gridcells in the models has gone up by a couple orders of magnitude. Separate ocean and atmosphere models have been combined into one to reduce errors. And the size of the models has gone from a few thousand lines of code to millions of lines of code.

And you reply

One would expect the same result if they were right the first time. A million variable monte carlo simulation of the apple and the tree would not have much effect on gravity.

I have come to expect warmunists will display scientific illiteracy and inadequate reading comprehension skills. Congratulations! Your comment displays both.
Willis said, and you quoted,

Separate ocean and atmosphere models have been combined into one to reduce errors.

Got that, KevinM? Willis was talking about an ATTEMPT TO REDUCE ERRORS.
In other words, Willis’ article is about precision. And the errors (note that, KevinM, the errors) provide a range of “1.5° to 4.5°C”, as Willis also says in his article. And that range has not altered in 30 years.
So, the errors have not been reduced in 30 years, after all the expensive effort you quoted Willis reporting. The range “1.5° to 4.5°C” is as imprecise as it was before all that time, money and effort.
KevinM, your comment says you failed to understand what you quoted.
Your comment talks about “the same result”. In other words, it is about the determined accuracy.
KevinM, precision and accuracy are different things.
I think the IPCC determination of climate sensitivity is grossly inaccurate: I think it is less than 1.0°C. Willis’ article suggests he also thinks the value determined by the IPCC is inaccurate and high. But it does not say that. It discusses the unchanged precision of the IPCC determined value. You have failed to understand what he wrote and you have displayed ignorance of the difference between precision and accuracy.
Richard

Editor
September 15, 2013 6:07 am

The canonical (incorrect) equation is:
Δ T = lambda ΔF
where T is temperature, F is forcing, lambda is the climate sensitivity, and Δ means “the change in”.”
You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?

The equation is what you get if you translate “climate sensitivity” into a mathematical expression (where the forcing is log₂ of the ratio of CO2 concentrations). “Climate sensitivity” is such a frequently used expression that climate scientists don’t bother to explain it much any more; e.g. James Hansen says in http://arxiv.org/abs/0804.1126 :

Paleoclimate data show that climate sensitivity is ~3 deg-C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6 deg-C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica.

(That paper was just the first applicable reference Google suggested.)
I’m surprised you’re unfamiliar with the concept. 🙂
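In code, that translation is a one-liner. The 3 °C figure used here is simply Hansen’s fast-feedback value quoted in the comment above, taken purely as an example input, not an endorsement of it.

```python
import math

def warming(conc_ratio, sensitivity_per_doubling=3.0):
    """Warming implied by 'climate sensitivity': ECS * log2(C/C0),
    where ECS is degrees C per doubling of CO2."""
    return sensitivity_per_doubling * math.log2(conc_ratio)

print(warming(2.0))        # one doubling of CO2
print(warming(560 / 280))  # same thing, written as a concentration ratio
```

Because the forcing enters only through log₂ of the ratio, each successive doubling contributes the same increment, which is why sensitivity is always quoted “per doubling”.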

thingadonta
September 15, 2013 6:08 am

“So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?”
I just watched a documentary on the Costa Concordia cruise ship disaster. Despite the latest state-of-the-art technology and the most sophisticated navigation systems, and despite a crew of around 1,000, the cruise liner crashed into a well known and well charted outcrop of rock marked on any standard chart of the area.
The reason? Entirely human error. The captain took the ship deliberately off course to do a favour for a colleague, who wanted to show off the ship to his friends on a nearby island.
Doesn’t matter what state of the art technology one has, if someone wants to completely ignore it, it doesn’t make any difference.
And his excuse? This outcrop of rocks didn’t show up on his maps of the area he had on the ship. But on any cruise course, maps which are not part of the scheduled trip are not required to show any sufficient amount of detail, because the ship is not supposed to be in such an area to begin with.
I suppose climate scientists would say, ‘well we didn’t know enough details of natural climate changes in our models, so how are we supposed to have predicted what would happen? ‘

Nick Stokes
September 15, 2013 6:10 am

TimTheToolMan says: September 15, 2013 at 6:02 am
“Willis has shown that the model output is of the simple form Δ T = lambda ΔF with an unbelievably high correlation (0.97 or so if my memory serves).”

So it is Willis’ canonical (wrong) equation?

TimTheToolMan
September 15, 2013 6:11 am

Willis writes “The exact same thing is going on with the climate. It is governed by a variety of emergent climate phenomena such as thunderstorms, the El Nino/La Nina warm water pump, and the PDO.”
My instinct also tells me this is so. And importantly for the modellers its not something you can model. Being an emergent property of the climate, it must also be an emergent property of the model to have any chance at all of being able to reflect associated changes in climatic processes.

AlecMM
September 15, 2013 6:11 am

Willis is wrong as is anybody who believes the incorrect IR physics used in Climate Alchemy.
The reason is that there is zero net CO2 15 micron band IR emission from the Earth’s surface, simple radiative physics.

TimTheToolMan
September 15, 2013 6:15 am

Nick writes “So it is Willis’ canonical (wrong) equation?”
I wouldn’t know. But you say it’s wrong, and yet it correlates closely with the model output. There is a conclusion one may be able to draw from that statement.

Nick Stokes
September 15, 2013 6:15 am

Ric Werme says: September 15, 2013 at 6:07 am
“The equation is what you get if you translate “climate sensitivity” into a mathematical expression (where forcing is the log2(ratio of CO2 concentrations).”

Then a lot was lost in translation. Let’s hear the original. That’s not what it says.

Resourceguy
September 15, 2013 6:15 am

It’s the inverse of Moore’s Law raised to the third power by the money term.

Stacey
September 15, 2013 6:16 am

Mr Stokes
I didn’t expect a straight answer from you presumably because you are incapable of answering my question?

Jim Cripwell
September 15, 2013 6:17 am

I think someone else wrote this before me, but please wake me up when the estimate of climate sensitivity is indistinguishable from zero.

Nick Stokes
September 15, 2013 6:18 am

TimTheToolMan says: September 15, 2013 at 6:15 am
“I wouldn’t know. But you say its wrong…”

No, I was quoting Willis. Well, he said “incorrect”.

September 15, 2013 6:18 am

Nick Stokes:
Your several posts in this thread display your usual ignorance and – also as usual – present irrelevance.
This thread is NOT about how much CO2 people are emitting to the atmosphere.
This thread is about failure to improve determination of climate sensitivity over 30 years.
Also, contrary to your assertion, there is no reason to think Charney et al. “guessed well” when they originally guessed the climate sensitivity range adopted by the IPCC.
Empirical – n.b. not model-derived – determinations indicate climate sensitivity is much less than the lower bound of the IPCC estimate: they indicate climate sensitivity is less than 1.0°C for a doubling of atmospheric CO2 equivalent.
This is indicated by the studies of
Idso from surface measurements
http://www.warwickhughes.com/papers/Idso_CR_1998.pdf
and Lindzen & Choi from ERBE satellite data
http://www.drroyspencer.com/Lindzen-and-Choi-GRL-2009.pdf
and Gregory from balloon radiosonde data
http://www.friendsofscience.org/assets/documents/OLR&NGF_June2011.pdf
Please provide another post if and when you can think of something to say which is on topic and sensible.
Richard

sailor1031
September 15, 2013 6:26 am

A general principle regarding taxes, held since roman times I do believe, was that people will pay taxes when either there is a sword at their throats (metaphorical these days, the IRS has other weapons) or paying the tax is less than the cost of avoidance.

September 15, 2013 6:29 am

Nick Stokes asks “You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?”
Here’s one that is close: http://www.people.fas.harvard.edu/~phuybers/Doc/Ockham.pdf See equation 1 where the change in the temperature of the atmosphere is calculated from the land and ocean surface temperatures plus a forcing (Fc). They would rather not have to separate out forcings and feedbacks, so they lump them all together and use the actual temperatures to determine that parameter. Their land and ocean surface temperatures come from short wave forcing, specifically seasonal short wave.
They admit their model doesn’t work at all in the tropics, which lack seasonal short wave forcing changes. In other words, the simplistic models that determine temperature from forcing changes like CO2 changes and seasonal solar changes do not work when the temperature is governed as Eschenbach explained here: http://wattsupwiththat.com/2009/06/14/the-thermostat-hypothesis/
Looking at the earth as whole (as Lindzen has explained many times) the global average temperature is constrained mostly by polar heat transport. The earth absorbs short wave and radiates long wave which is modulated by CO2 and water vapor in various forms. However the global average temperature is not determined by this balance, but by the movement of heat to the poles where it is lost. Here’s a paper Lindzen wrote about that http://www-eaps.mit.edu/faculty/lindzen/prggclhttr.pdf written around the time that Trenberth was heading into the rathole of global (or local) energy balance models.

Rick Bradford
September 15, 2013 6:32 am

*The models are on the wrong path.*
It seems clear that they are modelling noise, not signal, the most common sin of incompetent statisticians.

Gary Pearse
September 15, 2013 6:35 am

Nick Stokes says:
September 15, 2013 at 4:27 am
Don’t pretend you haven’t heard of the “Eschenbach Effect”. It’s peer reviewed, and as you can appreciate, if a skeptic gets something published in climate science, it has run a severe gauntlet to get there. I can’t believe you haven’t read it and the other related articles here.
It is telling if you haven’t. Can you explain why no matter how high the temperature may go, ocean temperatures cannot (do not!) exceed 31ºC, and often hover within a couple of degrees below and up to the 31ºC number?
I was blown away by the statement of Willis’s that climate models have had millions of lines written for them! Eventually you will be forced to let go of the C-phlogiston2 theory of climate, and will see that enthalpy changes in H2O and all its phases, plus convection and clouds overpower all else. Even if the ECS was 10 or 20, the Eschenbach Effect would operate to counteract the temperature rise.
http://wattsupwiththat.com/2009/06/14/the-thermostat-hypothesis
Imagine a system with an SST max of 31ºC, ice at the poles, and variable temperatures in between. If the planet warms significantly, the ITCZ stays near 31ºC on average while the polar areas warm up (polar amplification is easier to understand with the governor system); ice melts, etc.
Now imagine a cooling world where the polar amplification goes the other way, cooling down dramatically. The water and air exchange with the tropics drops tropical SST to something, say, in the mid-twenties. ITCZ cumulus formation arrives later and later in the day, to the point that the clouds do not form at all, giving the sun’s unimpeded insolation full force at the surface to try to get the ITCZ temperature up.
Depending on how cold the polar regions are, the SST may not be able to rise much; this is when all stops are pulled out for incoming solar, to no great avail, and we go into an ice age. I can imagine a string of temperature buoys, moving with the ITCZ through the seasons, being all we need to tell which way the climate is going over the globe. No mention of CO2 here. The Thermostat (operating on the well-known thermodynamics of water being heated and cooled) doesn’t care what is causing the temporary forcing. The forcing that stokes the climate steam engine to its limit is counteracted by changes in water’s liquid-vapour phases, by the work done in those changes, and by the cloud formation and thunderstorms that stop the stoking from being overdone.
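Here is a back-of-envelope sketch of the governor behaviour (my own toy numbers and functional form, nothing taken from any published model): if convective heat export ramps up steeply as SST approaches a cap near 31ºC, then even repeated doublings of the forcing move the equilibrium SST less and less.

```python
# Toy "thermostat" sketch (illustration only, not a published model):
# convective heat export rises without bound as SST nears a ~31 C cap,
# so large changes in forcing produce ever smaller changes in SST.

def equilibrium_sst(forcing, cap=31.0, base=25.0, steepness=2.0):
    """Solve for S where forcing = export(S), with
    export(S) = steepness * (S - base) / (cap - S),
    which blows up as S approaches the cap (governor behaviour)."""
    # Inverting forcing = steepness * (S - base) / (cap - S) gives:
    return (forcing * cap + steepness * base) / (forcing + steepness)

for f in (1.0, 2.0, 4.0, 8.0):  # double the "forcing" repeatedly
    print(f, round(equilibrium_sst(f), 2))  # SST creeps toward 31, never past it
```

Each doubling of the forcing buys a smaller SST increase than the one before, and the cap is never exceeded, which is the whole point of the governor analogy.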
Nick, don’t end up being one of the 100 scientists against the modern “General Relativity” of climate.

Nick Stokes
September 15, 2013 6:41 am

Eric1skeptic says: September 15, 2013 at 6:29 am
‘Nick Stokes asks “You’ve said this before, but again not quoting anyone. Whose ludicrous idea is it? Whose canonical equation? What did they say?”
Here’s one that is close:’

Well, it’s hardly close. There are four simultaneous equations, with temperature derivatives as well as temperatures, and several forcing terms. And they call that a minimalist model.

Richard111
September 15, 2013 6:44 am

There is no such thing as ‘greenhouse gases’. A GAS absorbs and radiates energy, proportional to its local thermal limits. IT CANNOT STORE ENERGY. Now nitrogen, oxygen and argon — gases which make up 99.9% of the atmosphere — do not absorb or radiate. They are considered TRANSPARENT.
So what happens to that energy once those gases have been heated? The gas expands, and rises. Has it cooled? NO, IT HASN’T! But those gases can pass energy by conduction to the ‘greenhouse gases’ which can then RADIATE that energy away. A small amount of energy can be passed to a cool surface but the cooled gas molecules will insulate much further energy transfer unless there is considerable turbulence near the surface. Cooled air sinks.
There you have it. Only the major atmospheric gases that must be warmed by conduction can store that heat energy until the so called ‘greenhouse gases’ radiate the energy away.
The wrong gases have been blamed for the ‘greenhouse effect’. If it wasn’t for those trace gases radiating away to space, the atmosphere would be so hot life would never have moved out of the sea.

Tom in Florida
September 15, 2013 6:44 am

Tom in Florida says: September 15, 2013 at 5:29 am
“What is the estimated gigatons of the entire atmosphere?”
Nick Stokes says:
September 15, 2013 at 5:51 am
“500,000. Basically, surface atm pressure * surface area. Varies a bit with humidity.”
So basically about 0.6% of the atmosphere is CO2, and the CO2 we add annually (30 gigatons) is about 0.006% of the atmosphere. Considering the better living conditions, lifestyle and conveniences, I can live with that very happily and without remorse. So can my kids, my grandkids, my great-grandkids, my great-great-grandkids and everyone after that.
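For anyone checking my arithmetic, here it is, using Nick’s rough 500,000-gigaton figure and the ~3,000 Gt of atmospheric CO2 mentioned elsewhere in the thread:

```python
# Rough percentages from the figures quoted above (all approximate):
atmosphere_gt = 500_000   # Nick's estimate of total atmospheric mass, Gt
co2_gt = 3_000            # approximate CO2 currently in the atmosphere, Gt
annual_co2_gt = 30        # approximate annual human CO2 emissions, Gt

print(round(100 * co2_gt / atmosphere_gt, 3))         # ~0.6 % of the atmosphere
print(round(100 * annual_co2_gt / atmosphere_gt, 4))  # ~0.006 % added per year
```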

September 15, 2013 6:49 am

Dear Mr. Eschenbach:
The reason growth in computing power hasn’t affected the accuracy of model output is that a mistake is a mistake; breaking it into smaller pieces doesn’t change its nature. If y is a function of some set of x_i, then increasing i has no positive effect on the accuracy of the output if your functional specification is wrong to begin with.
And, on an unrelated topic: cruise control is not a governor in Newtonian physics; gas consumption varies with terrain, crosswinds, and so on. A better example would be the role of wetlands in reducing the downstream effects of upstream flooding. However, I (and I assume most others) take your point: discount the limiters and you can imagine a runaway climate vehicle, but only if you have no idea how it works.

September 15, 2013 6:58 am

Nick, it is a nice example of why the complex GCMs have failed, so in that sense you are right. The main difference from the GCMs is that the authors lumped all the parameters into a single parameter. Typically a GCM will parameterize some of the weather and model the rest; here they placed it all in a single parameter determined by regression against empirical data. That is why this model outperforms the complex GCMs: it simply doesn’t pretend that the “physics” of weather can be modeled.
But in fact equation 1 is their complete model; the other three equations are included only to provide the one-month seasonal lag from land and ocean surface thermal inertia.
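To give a flavour of what such a lumped model looks like (my own minimal sketch with made-up numbers, not the paper’s actual four equations): everything is collapsed into a single sensitivity parameter, fitted by regression, with a one-step lag standing in for surface thermal inertia.

```python
import numpy as np

# Minimal sketch of a lumped one-parameter lagged model (my illustration,
# NOT the paper's equations): temperature T responds to forcing F with a
# one-month lag, and the single parameter lambda is found by regression.

rng = np.random.default_rng(0)
n = 240                                      # 20 years of monthly data
F = np.sin(2 * np.pi * np.arange(n) / 12) + 0.002 * np.arange(n)  # seasonal + trend
true_lambda = 0.8
T = true_lambda * np.roll(F, 1)              # response lagged by one step
T[0] = 0.0
T_obs = T + 0.05 * rng.standard_normal(n)    # "empirical" data with noise

# Least-squares regression of observed T on the lagged forcing:
X = np.roll(F, 1)[1:]
lam = np.sum(X * T_obs[1:]) / np.sum(X * X)
print(round(lam, 2))                         # recovers a value close to 0.8
```

The single fitted parameter absorbs everything the GCMs try to compute from first principles, which is exactly the lumping described above.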

edcaryl
September 15, 2013 7:00 am

I say again, has anyone run a climate model with a LOW climate sensitivity?

Editor
September 15, 2013 7:07 am

Nick says: “Why is El-Nino a forcing?”
I don’t believe Willis said ENSO was a forcing. He listed it under one of the governing factors.
Regards

wws
September 15, 2013 7:12 am

“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
Oh, I can think of quite a few! UFOlogy, Sasquatchology, Phrenology, the Pernicious Evil of Chemtrails, and Who Was on the Grassy Knoll?

Bill Illis
September 15, 2013 7:15 am

What’s interesting is that in 1979 there were two groups doing the early modelling: one managed by Syukuro Manabe at NOAA’s GFDL lab and one managed by James Hansen at NASA’s GISS. Hansen somehow got Manabe pushed to the sidelines immediately after this time, the start of maintaining/bullying the consensus.
Manabe’s models had the doubling sensitivity at 2.0C, 3.0C and 3.0C. Hansen’s models were 3.5C and 3.9C. The earliest consensus number from these five models was 3.0C with an uncertainty of +/- 1.5C, and thus the earliest range was given as 1.5C to 4.5C.
The Charney Report from 1979 outlines the science at the time (just 22 short pages). It’s amazing how little the science has changed from this report. And it certainly talks a lot about the points Willis raised, with the forcing/lambda shortcuts which took everyone down the wrong path.
Have a read of the Charney Report. It’s short, provides a better explanation of the theory than you will see from any climate scientist, and it highlights how little has changed. You’d think we would try to collect some data on the main uncertainties so as to reduce them, rather than just drumbeating “warming” over and over again for 34 years.
http://www.atmos.ucla.edu/~brianpm/download/charney_report.pdf

September 15, 2013 7:17 am

edcaryl:
At September 15, 2013 at 7:00 am you ask

I say again, has anyone run a climate model with a LOW climate sensitivity?

OK. I will answer that yet again, and I hope all who are bored with this answer will skip over it and forgive my posting it again.
The models cannot be run with low or zero climate sensitivity because they would be unstable.
I yet again explain this as follows.
None of the models – not one of them – could match the change in mean global temperature over the past century if it did not utilise a unique value of assumed cooling from aerosols. So, inputting actual values of the cooling effect (such as the determination by Penner et al.
http://www.pnas.org/content/early/2011/07/25/1018526108.full.pdf?with-ds=yes )
would make every climate model provide a mismatch of the global warming it hindcasts and the observed global warming for the twentieth century.
This mismatch would occur because all the global climate models and energy balance models are known to provide indications which are based on
1. the assumed degree of forcings resulting from human activity that produce warming, and
2. the assumed degree of anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature.
More than a decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.
The input of assumed anthropogenic aerosol cooling is needed because the model ‘ran hot’; i.e. it showed an amount and a rate of global warming which was greater than was observed over the twentieth century. This failure of the model was compensated by the input of assumed anthropogenic aerosol cooling.
And my paper demonstrated that the assumption of aerosol effects being responsible for the model’s failure was incorrect.
(ref. Courtney RS An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).
More recently, in 2007, Kiehl published a paper that assessed 9 GCMs and two energy balance models.
(ref. Kiehl JT, Twentieth century climate model response and climate sensitivity. GRL vol. 34, L22710, doi:10.1029/2007GL031383, 2007).
Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model. This is because they all ‘run hot’ but they each ‘run hot’ to a different degree.
He says in his paper:

One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.
The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy.
Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?, available at http://www.nature.com/reports/climatechange, 2007) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.

And, importantly, Kiehl’s paper says:

These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.

And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.
Thanks to Bill Illis, Kiehl’s Figure 2 can be seen at
http://img36.imageshack.us/img36/8167/kiehl2007figure2.png
Please note that the Figure is for 9 GCMs and 2 energy balance models, and its title is:

Figure 2. Total anthropogenic forcing (Wm2) versus aerosol forcing (Wm2) from nine fully coupled climate models and two energy balance models used to simulate the 20th century.

It shows that
(a) each model uses a different value for “Total anthropogenic forcing”, in the range 0.80 W/m^2 to 2.02 W/m^2,
but
(b) each model is forced to agree with the rate of past warming by using a different value for “Aerosol forcing”, in the range -1.42 W/m^2 to -0.60 W/m^2.
In other words, the models use values of “Total anthropogenic forcing” that differ by a factor of more than 2.5, and they are ‘adjusted’ by using values of assumed “Aerosol forcing” that differ by a factor of 2.4.
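For clarity, the two factors are simple ratios of the ranges quoted from Kiehl’s Figure 2:

```python
# The two factors above, computed directly from the Figure 2 ranges (W/m^2):
total_forcing = (0.80, 2.02)      # range of "Total anthropogenic forcing"
aerosol_forcing = (-1.42, -0.60)  # range of "Aerosol forcing"

print(total_forcing[1] / total_forcing[0])      # a factor of more than 2.5
print(aerosol_forcing[0] / aerosol_forcing[1])  # a factor of roughly 2.4
```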
So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.
Richard

beng
September 15, 2013 7:19 am

***
Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.
***
And yet, so many “scientists”, so much money, so many computer models, so much media & political frenzy, and so many new regulations & restrictions.

Vince Causey
September 15, 2013 7:19 am

Pesadia,
“Not much has happened in physics, unless you believe that the god particle has been discovered,
which I don’t”
I was going to say the ITER project, but I agree physics is a better example because it is pure science. Physics has been dwelling on string theory for the last 30 years and has gotten nowhere. Stephen Hawking is still looking for his theory of everything, and quantum theory still cannot be reconciled with GR.

Sheffield Chris
September 15, 2013 7:21 am

So Mr Stokes
3,000 gigatonnes currently, 30 gigatonnes being added annually by mankind.
Can I ask how much is produced through natural factors and how much is absorbed by natural cycles ?

James Strom
September 15, 2013 7:26 am

Willis, your theory of thunderstorms as governors is impressive. I wonder what kind of empirical evidence it would take to confirm it, or do you think of it as already confirmed? Seems to me that we have two types of candidates to explain recent non-warming, one, that heat is being vented into deep space (which I think is essentially yours) and the other, that heat is being vented into deep oceans. Although the second seems to me to be a piece of ad-hockery, it’s possibly true. Do we presently have the kind of observational capability that could decide either of these theories?

richard verney
September 15, 2013 7:31 am

Nick Stokes says:
September 15, 2013 at 4:49 am
//////////////////
AND? So WHAT?
Whilst on the point:
1.how many tonnes of GHGs do ants and termites emit into the atmosphere each year?
2. has the quantity they emit been steady since the pre-industrial era, or has it in fact increased these past 150 years or so?

richard verney
September 15, 2013 7:37 am

richardscourtney says:
September 15, 2013 at 7:17 am
/////////////
This follows from the fact that just when CO2 emissions began to rise significantly (circa the 1940s), rather than a corresponding temperature increase (which CO2 GHG theory requires), there was a temperature decline until about the mid-1970s.
This anomalous result required a negative forcing greater than the positive forcing of CO2; hence the application of aerosols and the highly negative forcings associated with them. None of this was based on real observational data, nor upon scientific experiment. Just a fudge.

John West
September 15, 2013 7:38 am

“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.
So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?”

While your answer is true, I think it’s incomplete. It’s “consensus science”, or the politicization of science, or however you want to term the hijacking of science for a particular agenda, that has ensured that only one path could be explored with any reasonable support.
In real science, progress is often made through failure, but failure is not an option in climate science. Failure would undermine the support for action (the cause); therefore there can be no admitted failures. Everything must be “consistent with” the theory. Models, no matter how useless, cannot be termed “failures” but are averaged into ensembles; lessons therefore cannot be taken from the failures and applied to new and improved models. Since the models are purported to be useful, and the important “forcings” are said to be well known (hence the certainty of the need for action), only minor tweaks can be made to them, or one is guilty of high treason against the consensus.
Anyone who commits high treason against the consensus can kiss his or her career goodbye; therefore only retired or nearly retired scientists, or those outside the climate science “community” (who are the least likely to be actively engaged in climate science research), can possibly challenge said consensus. The more senior scientists can be easily marginalized as being behind current research, or even portrayed as senile, and those outside the climate science clique can similarly be dismissed publicly. It’s a sad state of affairs.

John G.
September 15, 2013 7:47 am

Great analogy: climate as a speed-governed vehicle. You have to be careful assigning the corresponding parts, though. Obviously temperature is analogous to speed, but at first I wanted to liken the sun, or insolation, to fuel flow; then I realized nothing on earth could control that flow the way a governor controls fuel flow. So I decided the sun is more appropriately likened to the environmental factors (hills, wind speed and direction, road friction, etc.) that tend to speed up or slow down a car, just as sun/insolation fluctuations tend to heat or cool the climate. The car’s speed governor must then be compared to the various climate feedbacks (cloud formation, convection and heat transport, particulates, ocean oscillations, other albedo factors, etc.) that in combination govern the flow of heat, i.e. the flow of fuel. It works for me. Thanks.

Alan Watt, Climate Denialist Level 7
September 15, 2013 7:56 am

Two things. 1) typo alert “the El Nino/La Lina warm water pump” should be “La Nina”.
2) “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.” I nominate fusion research — we’ve been 10 to 20 years away from limitless fusion energy for the past 40 years now. Probably spent more money there than in climate research.

September 15, 2013 7:58 am

richard verney:
re your post at September 15, 2013 at 7:37 am.
Yes, quite so. And it is that balance between warming and cooling periods combined with the different amounts of ‘run hot’ of each climate model which define the unique climate sensitivity of each climate model.
Richard

minarchist
September 15, 2013 7:59 am

It is also likely that CO2 is an emergent phenomenon of climate, specifically of temperature over time, being the integral thereof, as Prof. Salby has explained.

Chuck Nolan
September 15, 2013 8:02 am

∆S = lambda ∆G
where S is speed, G is gas, and lambda is the coefficient relating the two.
————————————————
The problem is they know neither the value nor the makeup of lambda.
In the car example, everything from wind, tire pressure, altitude, temperature, humidity, up and down hills, road friction and wet or icy roads goes into the variable lambda.
In the case of climate, the value and makeup of lambda are even less well known.
Worst of all, lambda’s value is dynamic, so changes in some of its components cause changes in others, e.g.
car: increased temperature causes increased road friction, and
cagw: increased temperature causes increased cloud.
And they want to tell me the ∆t to 1/100°.
Note, at least in the car example the system is a little less chaotic. Maybe.
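To illustrate the dynamic-lambda problem (toy numbers of my own invention): if lambda drifts partway through the record and you fit a single constant to the whole thing, the fitted value describes neither the early nor the late regime.

```python
import numpy as np

# Toy illustration: the "true" lambda changes mid-record, but a single
# constant is fitted to the whole record. The fit matches neither regime.

G = np.linspace(0.0, 10.0, 101)            # "gas" (forcing) steps
lam = np.where(G < 5.0, 0.5, 0.2)          # lambda drifts halfway through
S = np.cumsum(lam * np.gradient(G))        # speed built up from dS = lambda*dG

fit = np.polyfit(G, S, 1)[0]               # single constant-lambda fit
print(round(fit, 2))                       # lands between 0.5 and 0.2
```

The fitted constant is a blend that never actually operated in the system, which is the hazard of lumping a dynamic quantity into one regression coefficient.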
cn

September 15, 2013 8:09 am

At last the secrets of environmental computation.
http://en.wikipedia.org/wiki/The_Turk

mbur
September 15, 2013 8:10 am

…sun is the gas tank, water is the engine ,and physics is the switch.
http://en.wikipedia.org/wiki/File:Phase_diagram_of_water.svg
Thanks for the interesting articles and comments.

Matthew R Marler
September 15, 2013 8:21 am

Note that this is NOT a feedback,
The governor is not a feedback? The cruise control certainly is a feedback.

Steve Oregon
September 15, 2013 8:22 am

Willis said
“So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?
I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.”
I’d add that the path was chosen, and the scientific pursuit is an attempt to prove the path correct without any regard for, or amendment in response to, signs that it is not. It’s even worse if detected errors are deliberately ignored.
If I were on a drive north from Portland to Seattle and I passed the California border, the Lake Shasta signs, the turnoff to Sacramento and all sorts of California road signs, I should conclude I’m on the wrong path.
But if my entire career, credibility and agenda were dependent upon my trip to Seattle, I’d eventually run out of BS.

Norm
September 15, 2013 8:25 am

Why is the IPCC/Gore/Obama/Hansen pushing to reduce CO2 if the problem is (supposedly) forcing? Why aren’t they ranking the various forcings and going after those? Their approach is 100% ass backwards, CO2 isn’t the problem.

September 15, 2013 8:26 am

We haven’t cured cancer or the common cold, we haven’t made a fuel better than gasoline, people still get Alzheimer’s, I still don’t have a flying car, there are many many fields of human endeavour where little progress has been made in a lot longer than 30 years. Climate “science” proves nothing, predicts nothing that occurs, establishes no new principles, and cannot prove or disprove its central hypothesis that CO2 alters the climate.
Some science, what would we do without them….

Pamela Gray
September 15, 2013 8:33 am

Willis, is your post a model or an equation? A model uses equations to search for the way in which something actually works now, in the past, and in the future. Does your equation function in all these ways and how good is it at hindcasting? Run the model as it currently is stated and tell us what direction you think growing conditions are heading in say 20 years. Run it backwards and tell us that it could have projected the awful freezing temperature in the 50’s and 70’s. Can’t? Will you have to add something to your model to do that or is it ready to go? Would your model have worked to warn farmers of the dustbowl? Would it have worked to encourage preparations for more diversity in farming production during the medieval warm period?
Regarding models, many people also panned efforts to go to the moon, saying it couldn’t be done. One or two scientists trying to find ways to do that privately thought it could not be done. Yet it was done, and they used models to come up with the best chance of being successful. Several of those models probably would not have worked. But the fact that they used models was wrong? And get this: They were using models to predict something that hadn’t happened yet and in which there was a good chance that someone was going to die! Now that was courageous! Were they foolish?
I think it is possible to eventually get these models right but it will take a long time. Why? This is a “duh” moment: We have to compare the model to the rather noisy observation and it takes a while to see whether or not there is a match. You are complaining about something that could not have been avoided. That’s right. They could not have avoided this length of time. Say What? You wanted them to have been correct within the first 5 years? How could that have been possible?
You must be part of the now generation. Here is a model of someone who wants what they want now and if they don’t get it, someone is wrong. You want someone to project what you want for breakfast. You want them to be right the first time. And you want it served at 6:00 AM sharp. If it doesn’t happen, the breakfast provider is wrong and they should lose their job. hmmm. Maybe the world should have fired Edison early in his career. And all the people that came before and after him regarding several model and subsequent invention improvements we now enjoy today. They took too long and wasted too much money.
But in reality, Science just seems to work this way. It takes too long and wastes too much money before they get it right. Yes it makes us mad and we want to throw the bums out. Good thing we weren’t successful in the past.

Bill Illis
September 15, 2013 8:39 am

The average of UAH and RSS are now lower than even the lowest climate model predictions from IPCC AR4. The lowest models have an in-built assumption of 2.2C per doubling (versus 3.0C in the average) and the lower troposphere is supposed to warm at 1.3 times the surface. So that should tell you something.
http://s24.postimg.org/uk2g52uol/IPCC_AR4_vs_HCrt4_UAH_RSS_Aug_2013.png

mrmethane
September 15, 2013 8:41 am

Feedback? Cruise control USES feedback to maintain speed in response to increases and decreases in measured speed about a SET-POINT. The nature of the feedback LOOP may or may not include delays to effect smoothing, hysteresis (“dead-band”), ramping, anticipation and other elements that enhance effectiveness in concert with comfort. Feedback is what happens INSIDE the control system, which has inputs and outputs, the outputs being RESULTS of the feedback “transfer functions” inside. A human driver adjusts the vehicle throttle (output) as a result of becoming aware of a speed “error signal”, with the feedback being part of the mental process used to decide on a throttle adjustment. May be a subtle point, but important. In a simple cloud system, internal POSITIVE feedback would result in reduced cloud cover in response to rising temperature, while NEGATIVE feedback would effect an INCREASE in cloud cover when the temperature rises.
Feedback is part of the “transfer function”.
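A bare-bones sketch of such a loop (my own toy dynamics; nothing automotive about the numbers): the throttle accumulates the speed error each step, so the speed is pulled back to the set-point even through an external disturbance.

```python
# Minimal cruise-control feedback loop (toy dynamics, illustration only):
# the throttle responds to the speed "error signal" each step, pulling
# speed back to the set-point despite a disturbance (a hill).

def cruise(setpoint=100.0, steps=400, gain=0.5):
    speed, throttle = 90.0, 0.0
    for t in range(steps):
        error = setpoint - speed                  # the feedback signal
        throttle += gain * error                  # integral-style correction
        hill = -5.0 if 50 <= t < 120 else 0.0     # external disturbance
        speed += 0.1 * (throttle + hill - 0.5 * speed)  # crude vehicle dynamics
    return speed

print(round(cruise(), 1))  # settles back at the set-point despite the hill
```

Note that the error signal, not the disturbance itself, is what the controller sees: the loop corrects the hill without ever “knowing” it was a hill, which is the point about feedback living inside the transfer function.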

Scott Basinger
September 15, 2013 9:00 am

Pamela Grey writes: “But in reality, Science just seems to work this way. It takes too long and wastes too much money before they get it right. Yes it makes us mad and we want to throw the bums out. Good thing we weren’t successful in the past.”
The fact that there are refinements and discoveries yet to be made in order to have models which are less wrong doesn’t bother me all that much. What does bother me is that many of the scientists involved are playing a game. The game is ‘sell fear’, and many who should honestly know better have beat the drum of consensus in order to keep the money coming.
I suspect that many in the field who have personally beat this drum are painfully aware that their models have been terrible at prediction, and that many physical processes they have assumed to work one way may in fact work the opposite way. The voices they’ve tried to squelch and badmouth (even as they had to redefine what the peer-reviewed literature was), such as Roy Spencer, may have been right all along about water vapour feedback. I bet it irks them that someone they have tried to paint as a creationist religious zealot turns out to be a more adept seeker of truth than those of the True Consensus Faith.
As for Willis, the criticism over your equation is valid. You would do well to admit it lest you appear as inflexible and incapable of change as your oft unworthy opponents.

September 15, 2013 9:23 am

richardscourtney says: September 15, 2013 at 7:17 am

So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.
Excellent comment, thank you Richard.
Parties interested in the fabrication of aerosol data used to force-hindcast climate models (force-fitting the cooling from ~1940 to ~1975 to compensate for the models’ highly excessive estimates of ECS) may find this 2006 conversation with D.V. Hoyt of interest:
http://www.climateaudit.org/?p=755
Douglas Hoyt, responding to Allan MacRae:
“July 22nd, 2006 at 5:37 am
Measurements of aerosols did not begin in the 1970s. There were measurements before then, but not so well organized. However, there were a number of pyrheliometric measurements made and it is possible to extract aerosol information from them by the method described in:
Hoyt, D. V., 1979. The apparent atmospheric transmission using the pyrheliometric ratioing techniques. Appl. Optics, 18, 2530-2531.
The pyrheliometric ratioing technique is very insensitive to any changes in calibration of the instruments and very sensitive to aerosol changes.
Here are three papers using the technique:
Hoyt, D. V. and C. Frohlich, 1983. Atmospheric transmission at Davos, Switzerland, 1909-1979. Climatic Change, 5, 61-72.
Hoyt, D. V., C. P. Turner, and R. D. Evans, 1980. Trends in atmospheric transmission at three locations in the United States from 1940 to 1977. Mon. Wea. Rev., 108, 1430-1439.
Hoyt, D. V., 1979. Pyrheliometric and circumsolar sky radiation measurements by the Smithsonian Astrophysical Observatory from 1923 to 1954. Tellus, 31, 217-229.
In none of these studies were any long-term trends found in aerosols, although volcanic events show up quite clearly. There are other studies from Belgium, Ireland, and Hawaii that reach the same conclusions. It is significant that Davos shows no trend whereas the IPCC models show it in the area where the greatest changes in aerosols were occurring.
There are earlier aerosol studies by Hand and in other in Monthly Weather Review going back to the 1880s and these studies also show no trends.
So when MacRae (#321) says: “I suspect that both the climate computer models and the input assumptions are not only inadequate, but in some cases key data is completely fabricated – for example, the alleged aerosol data that forces models to show cooling from ~1940 to ~1975. Isn’t it true that there was little or no quality aerosol data collected during 1940-1975, and the modelers simply invented data to force their models to history-match; then they claimed that their models actually reproduced past climate change quite well; and then they claimed they could therefore understand climate systems well enough to confidently predict future catastrophic warming?”, he is close to the truth.”
_____________________________________________________________________
Douglas Hoyt:
July 22nd, 2006 at 10:37 am
MacRae:
Re #328 “Are you the same D.V. Hoyt who wrote the three referenced papers?”
Hoyt: Yes
.
MacRae: “Can you please briefly describe the pyrheliometric technique, and how the historic data samples are obtained?”
Hoyt:
“The technique uses pyrheliometers to look at the sun on clear days. Measurements are made at air mass 5, 4, 3, and 2. The ratios 4/5, 3/4, and 2/3 are found and averaged. The number gives a relative measure of atmospheric transmission and is insensitive to water vapor amount, ozone, solar extraterrestrial irradiance changes, etc. It is also insensitive to any changes in the calibration of the instruments. The ratioing minimizes the spurious responses leaving only the responses to aerosols.
I have data for about 30 locations worldwide going back to the turn of the century.
Preliminary analysis shows no trend anywhere, except maybe Japan.
There is no funding to do complete checks.”
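Hoyt’s ratioing technique can be sketched numerically. Assuming a single Beer-Lambert extinction law (a simplification; real measurements involve more terms) and one plausible reading of “ratios 4/5, 3/4, and 2/3”, the ratios of irradiance at consecutive air masses cancel both the instrument calibration and the extraterrestrial irradiance, leaving only the atmospheric transmission:

```python
import math

def irradiance(i0, tau, air_mass):
    # Beer-Lambert direct-beam irradiance after traversing `air_mass`
    # atmospheres with total optical depth `tau`.
    return i0 * math.exp(-tau * air_mass)

def transmission_index(i0, tau):
    # Hoyt's ratios: irradiance at air mass 5 vs 4, 4 vs 3, 3 vs 2, averaged.
    # Each ratio equals exp(-tau), so i0 (calibration drift, changes in
    # solar output) cancels out of the result entirely.
    ratios = [irradiance(i0, tau, hi) / irradiance(i0, tau, lo)
              for lo, hi in [(4, 5), (3, 4), (2, 3)]]
    return sum(ratios) / len(ratios)
```

Because the ratios are independent of i0, a drift in instrument calibration leaves the index unchanged; only a change in optical depth (e.g. from aerosols) moves it, which is why the method is suited to long, heterogeneous records.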

John Norris
September 15, 2013 9:25 am

re: “… after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest.”
I’m betting we passed “millions of dollars” somewhere back in the 80’s.
Another item where the count has gone up is the number of models and modelers. You’re not a global team contributor if your country’s scientists don’t have a model input for the IPCC. Of course this further supports the theme of your post: despite the increased capacity and effort, there is no apparent improvement in our ability to predict the future. Their simulations all overshot reality.

September 15, 2013 9:27 am

Friends:
I write in hope of providing some clarity to the issues of modelling introduced to this thread by Pamela Gray at September 15, 2013 at 8:33 am and discussed by Willis Eschenbach in a series of subsequent posts.
First, I state some basic modelling principles.
A model is a simplified representation of reality.
Being simplified, no model is an exact emulation of reality; i.e. no model is perfect and no model is intended to be perfect.
A model is constructed for a purpose.
For example, a model of heat loss from a cow may assume that a cow is shaped as a sphere with the surface area of a real cow. And this simple model may provide an adequate quantitative indication of how heat loss from a cow varies with the cow’s metabolic rate. Thus, this hypothetical model may be very useful.
But that model of a cow cannot be used to indicate the movements of a cow. A model of a cow which includes legs is needed for that.
Another model of a cow may be constructed purely for the pleasure of the modeller. In this case it may be carved from wood and painted.
Possible purposes for models are infinite.
A model may have many forms.
It may be physical, abstract, algebraic, numeric, pictorial or an idea. If its form fulfils the desired usefulness then it is an appropriate model; i.e. it can fulfil its purpose.
In the context of the discussion between Pamela and Willis, there are two questions of importance because these questions are ALWAYS important when considering any model.
Question 1. What is the purpose of the model?
Willis explains this when he says

My model is a very simple model, which emulates the global temperature results of the climate models with extremely high fidelity.
{snip}
But then, what would you expect from a model (either mine or a GCM) which merely outputs a lagged, resized version of the inputs?

In other words, the purpose of his model is to determine the form of GCM outputs.
Question 2. Does the model fulfil its intended purpose?
Again, Willis explains this saying

My model is a very simple model, which emulates the global temperature results of the climate models with extremely high fidelity. Regarding the global temperature, it can do everything that the climate models can do … which, as my model shows, is nothing.
But then, what would you expect from a model (either mine or a GCM) which merely outputs a lagged, resized version of the inputs?

In other words, his model has fulfilled its purpose by determining the form of GCM outputs is “lagged, resized version of the inputs” and, thus, has demonstrated the GCMs output “nothing” of value.
Summary
Pamela seems to consider all models as being numerical models which are required to hindcast and to forecast. Of course, GCMs are numerical models with the purpose of hindcasting and forecasting climate. But Willis’ model is mathematical (n.b. not numerical) and was constructed to assess whether GCMs fulfil their purpose, and it shows they don’t.
Richard
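Willis describes his model’s output as a “lagged, resized version of the inputs”. A minimal sketch of such a one-box emulator follows; the sensitivity `lam` and lag `tau` are illustrative values chosen here, not Willis’s fitted parameters:

```python
import math

def lagged_emulator(forcing, lam=0.5, tau=4.0):
    # One-box lag: at each step the temperature relaxes toward lam * forcing
    # with e-folding time tau, i.e. the output is a lagged, rescaled input.
    alpha = 1.0 - math.exp(-1.0 / tau)
    temp, out = 0.0, []
    for f in forcing:
        temp += alpha * (lam * f - temp)
        out.append(temp)
    return out
```

A constant step forcing of 3.7 relaxes toward lam * 3.7 = 1.85; nothing in the output is more than a smoothed, rescaled copy of the input, which is the point of the argument above.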

Pamela Gray
September 15, 2013 9:29 am

Willis, I simply used a vignette to reveal a type of response similar to those portrayed in comments and the tone of your post. Vignettes are useful in that way. Which is, they got it wrong, throw the bums out. And there is history, also a useful endeavor, that helps us examine the process science has taken when they got it wrong, and wrong, and wrong again. Which together tells us that there are other possibly more fruitful ways to respond to the current state of climate science modeling. Modeling will not go away. So maybe we need to focus on reasoned and plausible ways to improve the modeling.

David Riser
September 15, 2013 9:30 am

Thanks Willis,
Genius at work again. I would add that the governor works better than most folks will admit. The error bars on the historical temp record are such that there hasn’t been any provable significant long term warming over the entire record. There is some legitimate concern that the temp record has been fiddled with over the course of the last 30 or so years to make a case for warming. It is surprising that there is such consistency across a temp record that is created by taking the median value of monthly median values of annual median values and creating a trend through statistics.
As for physics not making any progress over 30 years: there has been a steady improvement in the understanding of small-scale processes, which has filtered steadily into the practical application of those processes. So, for instance, every time someone says we cannot do X, we end up a few years later able to do it. One area this shows through is electronics: things continue to get smaller even after we thought that capacitance-caused resistance would stop the shrink. Our understanding of these processes has improved.
As for no flying cars… that is not because we can’t make a car that flies, its because a flying car is a regulation nightmare. A flying car has been a reality for a very long time, jet engines are small enough now to fit in a motorcycle, this gives more than enough power to put a car in the air. Once you get more than a single car up though you have the possibility of a pretty horrendous accident.
v/r,
David Riser

David Riser
September 15, 2013 9:36 am

Richard,
One thing to add about General Circulation Models (GCM), they were not originally designed to model climate. They are weather forecasting tools, they do this well over a period of about 48 hours and not so well over 3 – 10 days. So the idea they will accurately predict climate is absurd. In order to do that they would need to be many orders of magnitude larger, i.e. smaller grid sizes, and they would have to have a better understanding of atmospheric physics.
v/r,
David Riser

Bloke down the pub
September 15, 2013 9:42 am

Willis Eschenbach says:
September 15, 2013 at 6:14 am
Bloke down the pub says:
September 15, 2013 at 4:19 am
But suppose we turn on the governor, which in a car is called the cruise control. At that point, the relationship between speed and gas consumption disappears entirely—gas consumption goes up and down, but the speed basically doesn’t change.
~~~~~
I don’t think this is a suitable example Willis, as the cruise control controls the speed by controlling the flow of petrol. Admittedly an automatic gearbox would also play a part but that’s a separate issue.
~~~~~
To the contrary, it is an excellent example, because the clouds control the incoming energy based on the temperature, just as cruise control regulates the incoming gas based on the speed.
++++++++++++++
Willis, I agree with your last statement but that doesn’t seem to tally with the part of your original post that I’ve highlighted.
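The governor analogy under discussion can be made concrete with a toy simulation: a proportional controller sets fuel from the speed error, so fuel consumption tracks the load while the speed barely moves. The plant model, gain, and numbers here are invented purely for illustration:

```python
def simulate(loads, setpoint=60.0, kp=5.0, dt=0.1):
    # Toy cruise control: fuel is set proportionally to the speed error,
    # and speed responds to the fuel/load imbalance.
    speed, log = 0.0, []
    for load in loads:
        fuel = max(0.0, kp * (setpoint - speed))  # governor action
        speed += dt * (fuel - load)               # crude vehicle dynamics
        log.append((speed, fuel))
    return log
```

Driving this with a load that jumps from 10 to 30 leaves the speed within a few mph of the setpoint while the fuel flow triples: the feedback breaks the speed/fuel relationship, which is the sense in which the governor decouples output from input.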

September 15, 2013 9:43 am

Friends:
I write to draw attention to the post of Allan MacRae at September 15, 2013 at 9:23 am.
It is VERY important, especially its concluding paragraph.
When you know that each GCM is fudged by a completely arbitrary – and unique value for each GCM – of an assumed aerosol forcing then you know WHY Willis’ model of GCM output shows the models are worthless.
Richard

Pamela Gray
September 15, 2013 9:47 am

Richard, I do not consider all models to be numerical. I have been a follower of statistical and dynamical ENSO models for quite some time. I am arm-chair hobby familiar with both numerical (input historical data) and dynamical (use mathematical equations to simulate dynamically the processes we think are happening) ENSO models. I am also arm-chair hobby familiar with climate models and how various scientists use them by driving them with various tunable and data inputs. Right now I am attempting to fine-tune your model of me by creating a hybrid suggestion. It oughta work better.

Bart
September 15, 2013 9:49 am

Allan MacRae says:
September 15, 2013 at 9:23 am
“Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth.”
I wouldn’t be too insistent on that point. It is an at-least somewhat chaotic system, and may randomly switch from one trajectory to another. That they all run hot is the key damning point.
As for the arbitrary aerosol fudge, it’s kind of like those experiments in high school chemistry class where your experiment didn’t quite work out right, (because say, you stashed some of the chemicals in your pocket for later experiments of your own) but you know the answer so you write it up as if everything went perfectly. They just knew what the answer had to be, and so felt justified taking shortcuts to arrive at it.

September 15, 2013 9:54 am

David Riser:
Thankyou for your post addressed to me at September 15, 2013 at 9:36 am.
Yes, what you say is right. And the issue you raise goes to the nub of Willis’ article.
As you say, the GCMs were developed from weather forecasting models. But an F1 racing car is developed from Daimler’s invention. As Willis says

To me, that’s the Occam’s Razor explanation of why, after thirty years, millions of dollars, millions of man-hours, and millions of lines of code, the computer models have not improved the estimation of “climate sensitivity” in the slightest. They do not contain or model any of the emergent phenomena that govern the climate, the phenomena that decouple the temperature from the forcing and render the entire idea of “climate sensitivity” meaningless.

In other words, the modellers have spent much effort developing an F1 racing car from a basic form when what was needed was a fighter plane.
The only reasonable conclusion is that there should be a fundamental rethink about how to model climate.
Richard

Joe Dunfee
September 15, 2013 9:55 am

The video ad started playing automatically. When an ad starts playing audio without you clicking on it, it is REALLY annoying. Please see if there is any way to stop this from happening.

September 15, 2013 9:58 am

Riser,
So you suggest that we tolerate traffic jams and speed limits due to “regulations?” Noise, expense, required skills, many factors dictate that a ground vehicle stays on the ground. Moeller has gotten close, and the FAA does give him trouble, but the requirement for over one thousand HP for VTOL says we probably need an entirely new fuel and power source to ever make these practical. Helicopters are another matter entirely, not safe, very noisy, and difficult to pilot.
Climate “science” is another matter entirely. These people seem to think that they will be able to show us all that the lifestyle of the 18th century will be required so that we do not perish. And why, oh why does anyone care if the polar bears survive or not? Savage, nasty creatures, if it could prevent one baby from starving I would cheerfully consign all of them to extinction…

JPeden
September 15, 2013 10:00 am

Nick Stokes says:
September 15, 2013 at 4:49 am
“That’s hardly relevant to the provenance of this “canonical” equation. But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.”
Right, it’s “hardly relevant”, but then you immediately conjure up the very same Fear Factor Stacey’s question referred to, but which you’ve simply chosen to adopt, when you can’t be seriously worried about It by now – especially as compared to, say, the next time you choose to drive down the road. It’s amazing how some people choose to pass off their neurotic fear of life onto an imaginary Satanic Force which they also think they can avoid if they can just Force the rest of us to do what they want us to do, in order to appease It = You, eh Nick?

September 15, 2013 10:01 am

Bart:
At September 15, 2013 at 9:49 am you quote my having said

Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth.

and you comment

I wouldn’t be too insistent on that point. It is an at-least somewhat chaotic system, and may randomly switch from one trajectory to another. That they all run hot is the key damning point.

Actually, both points are key in that they each demonstrate the models don’t do what they are purported to do.
But the individuality of the models is the more important point. It also means that averaging outputs of different models is inadmissible, because the average of wrong is wrong.
Richard

Pamela Gray
September 15, 2013 10:01 am

Richard I draw your attention to AR4 chapter 8. Their description of climate models appears to differ from both yours and Willis’. It may be useful to read it and to speculate on what this section may look like in AR5.
http://ipcc.ch/publications_and_data/ar4/wg1/en/faq-8-1.html
http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-chapter8.pdf

September 15, 2013 10:10 am

Pamela Gray:
I write to say I fail to understand your comment at September 15, 2013 at 10:01 am.
It says

Richard I draw your attention to AR4 chapter 8. Their description of climate models appears to differ from both yours and Willis’. It may be useful to read it and to speculate on what this section may look like in AR5.
http://ipcc.ch/publications_and_data/ar4/wg1/en/faq-8-1.html
http://www.ipcc.ch/pdf/assessment-report/ar4/wg1/ar4-wg1-chapter8.pdf

Please state the difference you think exists because I’m darned if I know what it is.
Richard

RC Saumarez
September 15, 2013 10:24 am

Having revisited the “cold equations” , I have to agree with Pamela Gray.
All your equations are is a linearisation of the SB effect, which results in a simple ARMA difference equation. This can roughly emulate temperature, as is well known and has been pointed out by many workers, including Mann and McIntyre.
The point of a GCM is that it encapsulates mechanisms: if A happens then B will follow, which will precipitate C, and so on. This means that in such a model, parameters that relate to details of climate can be examined for their effects in the future. (Note: I am not saying that they do this with any particular accuracy.)
Therefore I think you are wrong in your assertion that your “model” performs as well as a GCM. Your model isn’t a model – it is a curve-fitting exercise, while GCMs attempt to capture mechanisms.
As an aside, if you are dealing with distributions of variables, as you discuss in your “Cold Equations”, it seems to me that your mathematics does not capture this.

September 15, 2013 10:34 am

RC Saumarez:
In your post at September 15, 2013 at 10:24 am you say

The point of a GCM is that it encapsulates mechanisms. If A happens the B will follow that will precipitate C and so on. This means that in such a model parameters that relate to details of climate can be examined for their effects in the future (Note: I am not saying that they do this with any particular accuracy).

You think a GCM “encapsulates mechanisms”?
If you really think that then you have been duped. All, yes, ALL the major climate mechanisms are fudged or contain parametrisations in the GCMs.
Each GCM is nothing more than a gigantic curve-fitting exercise. And none of them fit well: e.g. a good temperature fit provides a poor precipitation fit, a good … etc.
Richard

Pamela Gray
September 15, 2013 10:37 am

Willis here is what I said,
“Here is a model of someone who wants what they want now and if they don’t get it, someone is wrong. You want someone to project what you want for breakfast. You want them to be right the first time. And you want it served at 6:00 AM sharp. If it doesn’t happen, the breakfast provider is wrong and they should lose their job.”
If you want to, place yourself in “someone”‘s shoes in my vignette. I didn’t do that but be my guest. I suppose I should have used the pronoun phrase, “A person” instead of “you” so you wouldn’t get your knickers in a twist. But the phrasing of the vignette would have been torturous. Goodness you are sensitive to criticism.
I stand by my vignette. Many here want the models to be right. Some want them to be right related to oceanic-atmospheric oscillations (that’s me along with some others). Some want them to be right related to solar drivers (quite a few). And many want them to be right from the beginning and all along the way. But because they are not, all manner of tar and feathering is the right way to proceed. Yet if we had done that in the past, so many things we now enjoy, because they EVENTUALLY got it right, would not be our daily pleasure.
So I stand by my suggestion and what I actually search for and read. Models are here to stay, they are complicated, and we have a loooong way yet to go. Sooooo, how should they be improved, and is there anybody out there focused on that right now, and what have they published on it? The model discrepancies are fascinating to me. The two steps back fascinate me. Which leads me to also deliciously wonder, who will be in on this chapter of AR5 and who will not be included this time around? What weaknesses identified in AR4 have grown less of a concern and which ones are bigger? Which ones were researched in this interim and which ones were ignored? Why?
So many questions, so little time. But I do love following the story! If folks, as is implicitly and sometimes explicitly, suggested here many times in comments and sometimes in posts, chuck the baby out with the bathwater, my hobby will be far less interesting to me!

September 15, 2013 10:47 am

Pamela Gray says: September 15, 2013 at 8:33 am
“But in reality, Science just seems to work this way. It takes too long and wastes too much money before they get it right. Yes it makes us mad and we want to throw the bums out. Good thing we weren’t successful in the past.”
_______________
Pamela, I reject your above apologia. You try to ignore the facts, specifically the incompetent and disgraceful behaviour of the global warming alarmist camp. For example:
The fatal flaws in the climate models (GCM’s) regarding fabrication of aerosol data were pointed out more than a decade ago and should have been dealt with at that time. Instead, these flaws have been deliberately ignored to this day. As a result, the GCM’s employed by the IPCC are worse than useless. (ref. Courtney R.S., “An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre”, Energy & Environment, Volume 10, Number 5, pp. 491-502, September 1999).
The existence of the Medieval Warming Period (MWP) was deliberately suppressed by the global warming alarmists and the climate skeptics who demonstrated the valid existence of the MWP were attacked and slandered. In the April 2003 issue of Energy and Environment, Willie Soon and Sallie Baliunas of the Harvard-Smithsonian Center for Astrophysics and co-authors wrote a review of over 250 research papers that concluded that the Medieval Warm Period and Little Ice Age were true climatic anomalies with world-wide imprints – contradicting Mann’s hockey stick and undermining the basis of the Kyoto Protocol. Soon and Baliunas were then attacked in EOS, the journal of the American Geophysical Union.
In the July 2003 issue of GSA Today, University of Ottawa geology professor Jan Veizer and Israeli astrophysicist Nir Shaviv concluded that temperatures over the past 500 million years correlate with changes in cosmic ray intensity as Earth moves in and out of the spiral arms of the Milky Way. The geologic record showed no correlation between atmospheric CO2 concentrations and temperatures, even though prehistoric CO2 levels were often many times today’s levels. Veizer and Shaviv also received “special attention” from EOS.
The response of the global warmist gang was thuggish and imbecilic – they deliberately ignored all criticism, declared “the science is settled”, intimidated the editors of climate journals, and viciously attacked scientists who honestly pointed out the obvious flaws of their catastrophic global warming hypothesis. Global warming acolytes send death threats to climate skeptics, and some skeptics were victims of actual violence. The global warmist gang is akin to a “cargo cult” religion – they have clearly failed to pursue an honest, objective quest for scientific truth.
At a minimum, the warmist gang have systematically misled the people and their governments, damaged or destroyed the academic careers of their betters, and squandered over a trillion dollars of scarce global resources.
To be clear, honest, competent science does NOT “seem to work this way”.

Pamela Gray
September 15, 2013 10:51 am

Richard, you said, “Of course, GCMs are numerical models with the purpose of hindcasting and forecasting climate. But Willis’ model is mathematical (n.b. not numerical)…”
In the frequently asked questions linked document about AR4 issued by the IPCC we read that:
“Climate models are mathematical representations of the climate system, expressed as computer codes and run on powerful computers. One source of confidence in models comes from the fact that model fundamentals are based on established physical laws, such as conservation of mass, energy and momentum, along with a wealth of observations.”
Could you clarify your description of GCMs being numerical?

Pamela Gray
September 15, 2013 10:53 am

Folks, since this is a discussion on the models, I found it useful for me to re-visit what the AR4 said about them. I linked upstream to a Q and A as well as the actual chapter. You might find it useful too?

James Strom
September 15, 2013 10:59 am

Willis Eschenbach says:
September 15, 2013 at 7:39 am
Thanks Willis. I must have read most of these posts. Big whoops if you’ve already answered the question. I look forward to reading or rereading.
J

September 15, 2013 11:17 am

Michael Moon says: September 15, 2013 at 9:58 am
“…if it could prevent one baby from starving…”
Here is a relevant comment from 2009, when global temperatures had temporarily declined, arguably to ~1940 levels:
http://wattsupwiththat.com/2009/05/06/dealing-with-climate-change-in-the-context-of-other-more-urgent-threats-to-human-and-environmental-well-being/#comment-128055
stumpy (21:27:41) :
The money spent on Kyoto IN A SINGLE YEAR is sufficient to bring clean water and sanitation to every person on earth AND OPERATE THESE SYSTEMS FOREVER; these two factors alone would massively extend the lives of those in the third world and considerably reduce deaths, particularly infant.
***********************************************
If i recall correctly, the source for this statement was Bjorn Lomborg, several years ago, at the time of his first Copenhagen Consensus.
I’ve added significant corrections in CAPS.
Good comments Stumpy – thank you.
***********************************************
To put this issue into perspective, in the decades that we have been obsessed with the false crisis of Global Warming, as many as 50 million children below the age of five have died worldwide from contaminated water – equal to ALL the people who died in the Second World War.
Catastrophic Humanmade Global Warming is the BIG LIE of our time, and speaking the truth on this issue is an ethical and professional obligation.
I think we know enough from the satellite and surface data to state that Earth’s climate is insensitive to recent increases in atmospheric CO2. There has been no net global warming since 1940 – a full PDO cycle – in spite of an 800% increase in humanmade CO2 emissions.
We do not even know for certain that humanmade emissions are the cause of increased atmospheric CO2. We do know that at time scales ranging from years to hundreds of thousands of years, CO2 trends LAG, do NOT lead, temperature.
We also know that the only significant measured impact of increased atmospheric CO2 concentrations is increased plant growth and drought resistance.
Furthermore, in all probability a slightly warmer world would reduce human mortality, not increase it.
These are my honest opinions, based on several decades of study.
Regards to all, Allan

Lars P
September 15, 2013 11:24 am

“So … what is the most plausible explanation for this ludicrous, abysmal failure to improve a simple estimate in a third of a century?
I can give you my answer. The models are on the wrong path. And when you’re on the wrong path, it doesn’t matter how big you are or how complex you are or how fast you are—you won’t get the right answer.
And what is the wrong path?
The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere. The canonical (incorrect) equation is:…

You are spot on.
And if I correctly remember Salby showed what the true relationship between temperature and CO2 is.
However, in my humble opinion, I see another issue too:
Not only the models – the data too. What is the quality of the data?
If one uses models to “estimate” or “prepare” the data (what other reason can there be for continuously cooling the past and warming the present?), they introduce error into the data that they input to the models.
With such data prepared for the input:
http://stevengoddard.wordpress.com/data-tampering-at-ushcngiss/
Going backwards, a biased model can backcast close to that data, but it will not be of any use for forecasting, which is what we continuously see happening.
The more data tampering, the more inaccurate the historical record, and the worse the performance of the models based on that data will be.
So they do not have a way of finding out what the real drivers are.

September 15, 2013 11:25 am

Pamela Gray:
At September 15, 2013 at 10:51 am you ask me

Could you clarify your description of GCMs being numerical?

I answer, yes.
Oh, and I will include a clarification when discussing the difference between
(a) my description of GCMs
and
(b) the AR4 Chapter 8 description of GCMs
which you claimed exists (in your post at September 15, 2013 at 10:01 am).
I am awaiting your answer to my request for information on whatever you think that difference is, because I requested that information from you at September 15, 2013 at 10:10 am and Willis also asked you for it at September 15, 2013 at 10:13 am.
Richard

Pamela Gray
September 15, 2013 11:25 am

Allan, Science is a bloody sport. Lots of casualties of the white and black hat wearing practitioners. Always has been. I couldn’t handle it. Way too much backdoor wheeling and dealing. I was a weak one-hit wonder then quit. But I’m older now and I let stuff roll off my back so that I have more time to fish and hunt. But it sure is fun to watch science from the seats.
I watched a very good facsimile of it this morning while on my coffee-in-hand stroll about the farm I am visiting. The bulls were having a major conflagration. They have been at it every morning. Newcomers were added to the pasture and an old king-of-the-hill bull is losing ground. So everybody was fighting with everybody. There isn’t a better way to do this folks. Sometimes your prize bull gets hurt and you have to start all over again. That’s the way it goes. Sometimes the worst of them gets weeded out right from the beginning and everybody claps and gets to have the not-so-good bull over a BBQ pit. Eventually though a lead bull proves himself to be most capable of getting the herd pregnant, and impregnates most of the herd. And all the while entertaining the heck out of a little redheaded country girl. That is why I love the current state of climate modeling and am not too upset about the way it is going. Yes, it costs me tax dollar money watching them make mistakes or duke it out. But I still like it. Some say a little too much. (Don’t ask me which “game” I am referring to above. Too close to call.)
So sometimes I am going to question everybody and every thing regardless of which side of the fence a post sits. Sometimes I will add my tar and feathering efforts to the group effort. Sometimes I get to pump my fist in the air cuz I was right. And sometimes the score is 4 to 1 and I have to concede. Now that’s fun right there!
But it is still just an [admittedly expensive and waste-filled publicly supported arm of research] arm-chair adventure for a lot of us and so I don’t take myself as seriously as others do here. However, I do try…and have to remind myself…to think and reason as much as I can while having fun.
Meanwhile, back at the post…

September 15, 2013 11:30 am

I don’t like the cruise control analogy either. Obviously if you set the cruise control for 60 mph it will burn more gas than if you set it for 40 mph, so speed is still closely tied to gas use. And if the warmists were stuck with the cruise control analogy, they’d say the more CO2 we emit, the higher the “mph” cruise setting on earth’s temperature.
So, I would think CO2 in a car would be more akin to oil or paint, not gas. Clearly gas is the main “driver” of the car’s motion, so the cruise control analogy suggests that CO2 is for the climate what gas is for the car: the main and virtually sole driver. That’s ludicrous if you consider the evidence. Over the 17-year temperature stall we have seen temperature and CO2 “decoupled” (CO2 rising fast, temps flat), and the rate of temperature increase in the low-CO2 first half of the 20th century appears virtually identical to that in the higher-CO2 second half. Here too CO2 is “decoupled” from temperature; there is no discernible CO2 signal!
Finally, we know that there is evidence that temps affect CO2 levels, but there is no actual evidence (only a probably faulty theoretical model) that CO2 affects temperature: http://www.youtube.com/watch?v=WK_WyvfcJyg&CO2Lag

DirkH
September 15, 2013 11:31 am

“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
Origin of life.

MrX
September 15, 2013 11:33 am

Nick Stokes says:
September 15, 2013 at 4:49 am
But there are about 3000 gigatons of CO2 in the atmosphere now, and we’re emitting over 30 gigatons a year. That emission is rapidly increasing.
————————–
Assuming you’re correct, man made CO2 has a 5 year half life. So after 5 years, you’d have about 120 gT of CO2 in the atmosphere assuming a linear half life which it probably isn’t. This amounts to 4% of the total CO2 in the atmosphere as is being reported. But the amount won’t get higher than that even if we continue pumping CO2 into the atmosphere at 30gT per year because of the 5 year half life. It will stay identical. Rising production by humans won’t make it budge more than about 1.5 gT total forever per gT increase per year.
Yet the total CO2 is still climbing. You need CO2 that has an average halflife of 30 years for this to happen. So the rising CO2 is already known to not be from human sources.
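The arithmetic here can be checked under the stated assumptions: with a constant emission E and first-order removal at half-life t½, the excess stock settles at E/k = E·t½/ln 2 rather than accumulating indefinitely. A quick Euler sketch, with the removal model and numbers taken as assumptions from the comment:

```python
import math

def excess_stock(emission, half_life, years=200):
    # Euler integration of d(stock)/dt = emission - k * stock,
    # with k = ln2 / half_life (first-order removal assumption).
    k = math.log(2.0) / half_life
    stock = 0.0
    for _ in range(years):
        stock += emission - k * stock
    return stock
```

With emission = 30 and half_life = 5 the stock settles at 30 × 5 / ln 2, so the steady-state figure scales directly with the assumed residence time; a longer effective half-life raises it proportionally, which is the crux of the disagreement over whether the observed rise fits a 5-year half-life.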

RC Saumarez
September 15, 2013 11:33 am

At Richard Courtney,
I am well aware of parameterisation in mathematical models. You have an equation which is an abstraction of a physical process. This is parameterised. I have done this many times in a different type of model.
For example, the NCAR CAM3 model contains, among other things, a thermodynamic model of sea ice. This is based on physical assumptions (which may not be correct). This allows the formation of sea ice to be predicted through defined physical mechanisms.
This is a completely different intellectual process to Eschenbach’s difference equation.

Pamela Gray
September 15, 2013 11:38 am

Richard, I have pointed out that AR4 describes them as mathematical. They attempt to dynamically reproduce climate processes using code, and they use the term “mathematical” to describe that code. But you describe them as numerical. Why do you think the code is numerical? As I said, there appears to be either a difference in understanding of the code or the terms, or an equivocal use of the two terms. So what do you mean when you use the term “numerical”? What we may be having is a discussion of terms that neither of us, or that AR4, has used correctly. I don’t know. You tell me. That is why I asked for a clarification: there appears to be a difference here and I wanted to know why, and your reasoning.
From what I have read about the ENSO models, the dynamical models certainly use mathematical code strings to represent physics-based climate processes and connections as they understand them to be.

Latitude
September 15, 2013 11:40 am

Lars P says:
September 15, 2013 at 11:24 am
========
100%…
They diddled with the past temp record to make it more scary than it was…
….secondary to trying to create an accurate enough temp record from trees, pollen, and ice etc
a 1/2 degree
No wonder all the computer games showed a higher increase in temps than what really happened..
…they can’t even get a flat trend line correct
The computer games will never be, can never be…and are all doomed laughing stock failure
…until they admit they’ve lied about the historical temp record
What are the odds of that happening?

Latitude
September 15, 2013 11:41 am

Assuming you’re correct, man made CO2 has a 5 year half life….
absolute garbage

RC Saumarez
September 15, 2013 11:42 am

At Willis Eschenbach.
Thank you for your rather intemperate reply.
I have in the past been forced to write a post on WUWT to correct the complete rubbish you wrote about signal processing – a subject that you have not studied in any depth.
I am well aware of the nature of a “black box” model, although those of us who are more familiar with the subject might regard it as a set of transfer functions, describing functions or differential equations.
What you clearly fail to recognise is that when you write a differential equation, you are making a statement about the physical structure of the system in question. This brings into question observability and behaviour under a generalised set of variations.
I can see that I will have to write another post to educate you on basic systems analysis.

September 15, 2013 11:43 am

I don’t see governing factors invalidating ∆T = lambda ∆F. Instead, I see extra negative feedbacks reducing the value of lambda, or causing it to decrease as ∆F increases. Even Dr. Roy Spencer, a big-name scientist on the skeptical side, sees validity in ∆T = lambda ∆F. One thing he has done is attempt determinations/estimates of lambda, and he comes up with lower values than the IPCC favors.
One negative feedback that I see increasing as greenhouse gas concentration increases is the lapse rate feedback. A positive feedback that I see decreasing as the world warms is the surface albedo feedback – seasonal snow and ice will retreat to places and times of year where/when there is less sunlight affected by further change of snow and ice, or in an extreme case be largely gone.
I do see how peak tropical ocean surface temperatures are largely regulated by factors reducing the climate sensitivity there, such as big blowups of convection from ocean getting hot up to the (greenhouse-gas-cooled) top of the troposphere. I don’t agree that they are completely governed. For example, if solar output makes a major long-term change, or the distance between Earth and the sun changes, then peak tropical ocean surface temperature will change.
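The relation being debated here, ∆T = lambda ∆F, can be made concrete. The sketch below is my own illustration, using the standard simplified CO2 forcing expression ∆F = 5.35 ln(C/C0) W/m² (Myhre et al. 1998); the lambda values in the loop are purely illustrative, not anyone’s preferred estimates:

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified radiative forcing for CO2 (Myhre et al. 1998), in W/m^2."""
    return 5.35 * math.log(c / c0)

delta_f = co2_forcing(560.0)          # forcing for a doubling, ~3.7 W/m^2
for lam in (0.3, 0.5, 0.8):           # illustrative lambda values, K per (W/m^2)
    print(f"lambda={lam}: dT = {lam * delta_f:.2f} K")
```

At the often-quoted ∆F of about 3.7 W/m² for a doubling, lambda values from roughly 0.4 to 1.2 K per W/m² span the 1.5 °C to 4.5 °C sensitivity range discussed in the head post.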

Pamela Gray
September 15, 2013 11:56 am

Here is a simple explanation of dynamical mathematical models versus statistical models. Most if not all GCM models are of the dynamical kind, meaning they would be considered to be mathematical.
http://iri.columbia.edu/climate/ENSO/background/prediction.html#types

RC Saumarez
September 15, 2013 11:57 am

@Richard Courtney(2)
I find your comments as regards mathematical models and “numerical models” absolutely extraordinary.
Numerical models are used to solve a set of equations that cannot be solved analytically. If you look at NCAR Cam 3 for example, there are numerous physical assumptions that are cast in a differential form (usually). Since there isn’t a hope of solving these analytically, the equations are approximated numerically.
There is a long discussion about the framework in which the equations describing physical processes such as diffusion, advection in different coordinates, evaporation etc., are solved numerically to give a stable solution (hopefully) at defined time steps.
Let me give you a simpler example. I solve equations governing cardiac propagation. This is theoretically a three-dimensional cable equation. However, the ionic currents are dominated by highly non-linear equations that depend on local potential. I can write the basic 3-D equations for this, but there is no prospect of solving them analytically; the solution is obtained through standard numerical techniques.
Note: the model is physical. Diffusion, charge flow and capacitance are assumed from experimental data and basic theory. The ionic current flow equations are semi-empirical, based on observation and some speculation about how they are controlled. The solution is numerical, which rests on a large body of maths about how you approximate differential equations and solve them iteratively.
I think Pamela Gray is dead right and you and Eschenbach are wrong.
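The numerical-solution process described in this comment, physical assumptions cast in differential form and stepped forward on a grid because no analytic solution is feasible, can be illustrated with the simplest possible case: the 1-D diffusion equation solved by explicit finite differences. This is a minimal sketch of the general technique only, not code from CAM 3 or from any cardiac model:

```python
def diffuse_1d(n=50, steps=500, alpha=1.0, dx=1.0):
    """Explicit finite-difference solution of du/dt = alpha * d2u/dx2.

    The PDE is replaced by differences on a grid and stepped forward in
    time. dt is chosen below the explicit stability limit dx^2/(2*alpha);
    exceeding that limit makes the iteration blow up instead of
    converging to a sensible solution.
    """
    dt = 0.4 * dx * dx / alpha          # safely under the 0.5 stability limit
    u = [0.0] * n
    u[n // 2] = 100.0                   # initial heat spike in the middle
    for _ in range(steps):
        u = [u[i] + alpha * dt / dx**2 * (u[i - 1] - 2 * u[i] + u[i + 1])
             if 0 < i < n - 1 else 0.0  # fixed (zero-temperature) boundaries
             for i in range(n)]
    return u

u = diffuse_1d()
print(max(u))  # the spike has spread out and decayed through the boundaries
```

The stability constraint is the point of contact with the comment: the scheme only behaves if the time step respects the grid spacing, exactly the kind of numerical (as opposed to purely analytic) consideration being argued over in this thread.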

September 15, 2013 11:59 am

RC Saumarez:
Thankyou for your reply to me at September 15, 2013 at 11:33 am.
I copy it here in full to save others needing to find it.

I am well aware of parameterisation in mathematical models. You have an equation which is an abstraction of a physical process; this is parameterised. I have done this many times in a different type of model.
For example, the NCAR Cam 3 model contains, among other things, a thermodynamic model of sea ice. This is based on physical assumptions (which may not be correct). It allows the formation of sea ice to be predicted through defined physical mechanisms.
This is a completely different intellectual process to Eschenbach’s difference equation.

Yes, it IS “a completely different intellectual process to Eschenbach’s difference equation”.
But that was never in dispute.
In your post at September 15, 2013 at 10:24 am you say

The point of a GCM is that it encapsulates mechanisms. If A happens the B will follow that will precipitate C and so on. This means that in such a model parameters that relate to details of climate can be examined for their effects in the future (Note: I am not saying that they do this with any particular accuracy).

I replied saying at September 15, 2013 at 10:34 am

You think a GCM “encapsulates mechanisms”?
If you really think that then you have been duped. All, yes, ALL the major climate mechanisms are fudged or contain parametrisations in the GCMs.

YOUR RESPONSE IS TO CHANGE THE SUBJECT!
And a parametrisation is a guess. It may be an educated guess, but it is only a guess. You admit this when you say,
“This is based on physical assumptions. (They may not be correct).”
Willis’ model does not include any guesses.
So, as you say, parametrised GCMs use a completely different intellectual process to Eschenbach’s difference equation. The difference is that Eschenbach’s difference equation is not based on guesses.
Richard

Brian H
September 15, 2013 12:00 pm

Joe Dunfee says:
September 15, 2013 at 9:55 am
The video ad started playing automatically. When an ad starts playing audio without you clicking on it, it is REALLY annoying. Please see if there is any way to stop this from happening.

I haven’t seen an ad, much less a (commercial) video, on this (or almost any other site) for years.
1. Use your system settings to turn off Autoplay.
2. If using a browser that supports Add-ons, use AdBlock Plus or similar. I use FireFox, partly for the wealth of Addons it supports. I also highly recommend its TabMix and Lazarus addons. And NoScript. And UnMHT. And Download Helper. And Ghostery.

Pamela Gray
September 15, 2013 12:07 pm

These links provide information on a piece of the code in climate models used to calculate radiation. The first link describes early efforts on a string of code used to calculate the radiation process within a climate model. The second link describes the current efforts to work on this section of code. The code appears to me to be mathematical calculations of a climate dynamic.
http://www.gfdl.noaa.gov/bibliography/related_files/rge9101.pdf
http://onlinelibrary.wiley.com/doi/10.1029/90JD01618/abstract

September 15, 2013 12:07 pm

RC Saumarez:
Your post at September 15, 2013 at 11:57 am begins saying

@Richard Courtney(2)
I find your comments as regards mathematical models and “numerical models” absolutely extraordinary.

And I find your posts both offensive and completely ignorant of the subject.
You make vague assertions interspersed with blatant errors. And when those errors are pointed out you change the subject. See my answer to you at September 15, 2013 at 11:59 am.
Post something sensible or choose to clear off.
Richard

Pamela Gray
September 15, 2013 12:09 pm

Hey Joe! The same thing has been happening to me! Twice this morning! It just started happening today on this thread. Weird.

bones
September 15, 2013 12:20 pm

You asked: “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
String theory would have to be at the top of the list. Like CAGW theory, to its adherents it is unfalsifiable.

Pamela Gray
September 15, 2013 12:20 pm

RC Saumarez I must ask, is your work related to the hunt for how to fix a heart without replacing it, and/or how to create a heart that is better at being a normal heart?

September 15, 2013 12:21 pm

Pamela Gray:
I am replying to your ridiculous post at September 15, 2013 at 11:38 am.
1.
You said IPCC AR4 Chapter 8 disagreed with my and Willis’s description of GCMs and set me the homework of reading the entire chapter to try to find that difference.
2.
I said you needed to tell me what the difference is.
3.
Your post I am answering says to me

They use the term “mathematical” to describe the code. But you describe them as numerical. Why do you think the code is numerical?

That’s it? One word? And you wanted me to search the entire chapter to find it!
Pamela Gray, I don’t know what you are doing in this thread, but it does not seem to be constructive.
Firstly, I did NOT talk about the “code”: I did not mention it.
I talked about the models.
The GCMs use finite difference analysis to iterate to a stable solution. That is a numerical model obtaining a numerical solution. The fact that the models are coded with mathematics does not change that.
Richard

Pamela Gray
September 15, 2013 12:25 pm

And I must add, you are a busy man/woman! Is your background primarily in math, medical science, or both? I know of research teams who specifically include mathematicians because of the research being done creating/with models. So I say again, you have been busy!

September 15, 2013 12:33 pm

It’s been my understanding that a “governor” as used in motor vehicles is meant to cap the top speed, not to maintain a consistent, user-defined speed. Maybe it’s just a difference in usage.

September 15, 2013 12:33 pm

“You see, back around 1980, about 33 years ago, we got the first estimate from the computer models of the “equilibrium climate sensitivity” (ECS). This is the estimate of how much the world will warm if CO2 doubles. At that time, the range was said to be from 1.5° to 4.5°.”
Actually there is a longer history of the number than that. If you want a real fun lesson in the history of science look at estimates of the speed of light or things like the electron charge.
A good history was started here
http://bartonpaullevenson.com/ClimateSensitivity.html
I would make it a WUWT reference page
One also has to distinguish methods. Most climate scientists don’t see models as the best source of estimates. The best evidence is observational.

papertiger
September 15, 2013 12:43 pm

wws says:
September 15, 2013 at 7:12 am
in response to “Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
Oh, I can think of quite a few! UFOlogy, Sasquatchology, Phrenology, the Pernicious Evil of Chemtrails, and Who Was on the Grassy Knoll?
Steven Goddard seems pretty convinced that there was somebody on the Grassy Knoll.
Because there’s no way on Earth an ex-Marine could shoot a politician’s pumpkin sized head, travelling at parade speed, from 90 yards away, on a clear day, with no wind, only given three shots. /sarcasm tag
cross tagged / pointing out nominal allies’ stupidities

CRS, DrPH
September 15, 2013 12:48 pm

Beware of the “climate automatons” and their latest hope, ocean acidification:
“Or have you ALL morphed into climate automatons?”
http://seattletimes.com/html/localnews/2021826582_westneat15xml.html

…as reporter Craig Welch is documenting in this week’s Seattle Times: The global warming debate has become a sideshow anyway. Ocean acidification, global warming’s evil twin, is caused by the same culprit, is easily measurable and is already a crisis.
There’s not much to debate so far: We’re poisoning the oceans with acid. So will we do anything about this one?

cd
September 15, 2013 12:52 pm

The wrong path is the ludicrous idea that the change in global temperature is a simple function of the change in the “forcings”, which is climatespeak for the amount of downward radiation at the top of the atmosphere.
As far as I know, many statistical models make that assumption. But GCMs are a little more complicated.
Perhaps I’m wrong, but what is suggested here is not how GCMs work. The GCMs start out with an initial forcing, probably derived from some radiative transfer model, and then modify other parts of the system after this initial perturbation. They make a series of assumptions, then run the model, which has a volumetric component – the cells. The models are run time and time again with different assumptions in order to determine how much of a change in an individual “feedback” is required to reproduce the given historical record, all the while being conditioned by physical laws (as the perturbation permeates through each cellular element). They can, at least with the coupled ocean-atmosphere GCMs, account for some natural variation as a result of ocean thermal inversion.
I think Dr Chris Essex has an excellent video on their limitations and why they don’t work. And I don’t think the current review is nuanced or detailed enough to account for why they’re crap.

September 15, 2013 12:56 pm

cd:
At September 15, 2013 at 12:52 pm you say of GCMs

I think Dr Chris Essex has an excellent video on their limitations and why they don’t work. And I don’t think the current review is nuanced or detailed enough to account for why they’re crap.

Please state the review to which you are referring.
Richard

Pamela Gray
September 15, 2013 12:58 pm

I think the issue here is that some folks here may be interpreting climate models to be a set of mathematical calculations (short or long) that are all a priori calculations, IE they run by the numbers. To the best of my knowledge, and correct me if I misinterpret:
The GCM (global circulation model) code itself cannot be described as a priori calculations of numeric pre-set sequences. They are dynamical GCM simulations (all done with mathematical calculations and based on current understanding of natural, physics-based processes) that include a suite of randomizing intrinsic and extrinsic natural variables (because of weather and oscillation variations) and that are driven with and without a priori numerical input (scenarios) as many times as you have grant money for. It is even possible, I think but don’t know for sure, to input randomizing variables if you believe your scenarios are themselves dynamical.
The results are the multiple spaghetti graphs of all the runs, with an average of the runs somewhere in the middle. This is then graphed as angled sets of ever-widening error bands the further out the simulations are allowed to run from starting conditions. Why the ever-widening error bands? That’s a phenomenon of the natural variation known to limit how well current conditions can predict future climate. The code itself is thus dynamical. The outcomes become scenarios (and are then given model version names) when driven with numerical a prior input (i.e. CO2 concentration increase from baseline at 1% per year, etc.).
So I think Richard maybe is partially correct? GCM models used to produce CO2 scenarios are both numerical and “mathematical”, if I am using his understanding of the terms correctly. I prefer to use a priori to distinguish between a natural run and a forced a prior run. But then I am just an armchair enthusiast watching the game, not playing it. So what do I know.
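The spaghetti-graph behaviour described above, repeated runs whose spread widens the further out they go, can be reproduced with a toy ensemble: the same deterministic trend plus random year-to-year variability, run many times. The code below is a hypothetical illustration of why ensemble spread grows with lead time, not a claim about any actual GCM:

```python
import random
import statistics

def ensemble_spread(n_runs=200, n_steps=50, trend=0.02, noise=0.1, seed=1):
    """Toy ensemble: each 'run' accumulates the same trend plus random
    year-to-year variability (a stand-in for intrinsic weather noise).
    Returns the standard deviation across runs at each time step, which
    grows with lead time -- the ever-widening error bands.
    """
    random.seed(seed)                   # reproducible illustration
    runs = []
    for _ in range(n_runs):
        t, path = 0.0, []
        for _ in range(n_steps):
            t += trend + random.gauss(0.0, noise)
            path.append(t)
        runs.append(path)
    return [statistics.stdev(r[i] for r in runs) for i in range(n_steps)]

s = ensemble_spread()
print(s[0] < s[-1])  # the spread widens with lead time
```

With this random-walk style accumulation the across-run standard deviation grows roughly as the square root of lead time, which is what produces the cone shape of the error bands around the ensemble mean.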

Pamela Gray
September 15, 2013 1:00 pm

a prior”i”. Jeesh Pam. Get it right.

John F. Hultquist
September 15, 2013 1:05 pm

I just checked with “the Team” and 97% had to look up the meaning of “following wind.” They thought you were referring to their normal strategy of passing gas.

RC Saumarez
September 15, 2013 1:07 pm

@Pamela Gray.
I assure you that my goal is not to break hearts!
I have been involved in the prediction of sudden cardiac death. This has involved extensive measurements made within patients’ hearts (which is surprisingly safe) and interpretation of the results through signal processing and mathematical modelling.
The object is to identify patients who need implantable cardioverter-defibrillators (ICDs), the issue being that the majority of these devices never deliver therapy, so exposing patients to the risk of complications at considerable cost and no benefit.

September 15, 2013 1:07 pm

Pamela Gray:
In your post at September 15, 2013 at 12:58 pm you write

So I think Richard maybe is partially correct?

What you “think” does not alter the fact that I am completely correct in what I have written in this thread.
The problem is that you clearly do not understand the code is not the model: the code determines how the model will operate.
You need to look up finite difference analysis (FDA) and finite element analysis (FEA).
You may then desist from your mud slinging.
Richard

cd
September 15, 2013 1:08 pm

Richard

Pamela Gray
September 15, 2013 1:08 pm

Richard, it was actually on the first page of the first link not too far from the first word. I figured you would read it. When someone sends me a link, I read it. But that’s me. Sorry if I suggested something onerous to you.

Tom
September 15, 2013 1:12 pm

“Can anyone name any other scientific field that has made so little progress in the last third of a century? ”
—————
As a couple of others have mentioned, physics’s String “Theory” fits the bill … but it’s not really a “theory”; “hypothesis” is more accurate. “Supersymmetry” is not far behind. Not a shred of physical evidence for either one (just mathematics), and a lot of evidence against. Many proponents have spent their entire careers on these subjects, and they twist & contort logic such that they’ve made String “Theory” essentially unfalsifiable. Little green aliens that disappear whenever you turn around can also be made unfalsifiable.
The only reason the authoritarian leftists & politicians have not jumped on the empty String Theory bandwagon is that they can’t figure out how to use “Strings” to:
— raise taxes by Trillions of $ .
— increase the size, power, and control of central gov’t .
— profit from money-making scams & rent-seeking.

September 15, 2013 1:17 pm

cd:
You have posted a 52 minute video and addressed it to me.
If you have something to say then say it. I have no intention of watching for 52 minutes merely because something has appeared on the web.
Richard

Pamela Gray
September 15, 2013 1:19 pm

Up through the leg I assume.

rogerknights
September 15, 2013 1:24 pm

Pamela Gray says:
September 15, 2013 at 12:09 pm
Hey Joe! The same thing [a spoken ad starting automatically] has been happening to me! Twice this morning! It just started happening today on this thread. Weird.

Me too. I don’t think it’s a browser problem. I think WordPress has fiddled with something.

September 15, 2013 1:24 pm

Pamela Gray:
You crossed the line with Willis up thread. And you have now crossed it with me with your post at September 15, 2013 at 1:08 pm.
You set me the task of finding an unspecified difference between what I said and you claimed was in an entire IPCC Chapter. Willis told you why that was misbehaviour up thread at September 15, 2013 at 10:13 am.
And I would not have found anything because what you mistakenly thought was a difference was ONE WORD which YOU DID NOT UNDERSTAND.
And you try to excuse that!?
I have had enough of you. Post something sensible or choose to clear off.
Richard

September 15, 2013 1:25 pm

Joe Dunfee says:
September 15, 2013 at 9:55 am
The video ad started playing automatically. When an ad starts playing audio without you clicking on it, it is REALLY annoying. Please see if there is any way to stop this from happening.
===================================================================
Ad Block, and tip Anthony now and again.
Sorted. And I gather that even IE has Ad Block now.

RC Saumarez
September 15, 2013 1:25 pm

@Richard Courtney.
I decline to be told to clear off by someone as rude as yourself. I would rather make a proper argument.
a) Mathematical models of physical processes involve a set of equations that are based on physical principles. They contain extrinsic variables such as mass, energy, temperature, etc. The parameterisation involves assigning constants to the system involved: actual mass, surface areas, etc. The parameterisation depends on the model in question and on how the variables vary. Sometimes it can be highly specified; in others, where the model is simplified, it will be less accurate.
b) If one writes an equation using, say, conservation of momentum and advection, one is making a definite physical statement as to how the system works. This captures processes within the system. For example, ice-atmosphere interactions in GCMs.
c) I do not regard Eschenbach’s model as a model. It simply assumes a first-order process through linearisation and is thus a parameter-fitting exercise. If it were not, one might have considerable reservations about its parameterisation.
d) The purpose of a GCM is to attempt to answer the question: we know the state of the system at present, so what will happen next? They try to achieve this result mechanistically. This cannot be approached by a model such as Eschenbach’s.
e) Your statement:
“The GCMs use finite difference analysis to iterate to a stable solution. That is a numerical model obtaining a numerical solution. The fact that the models are coded with mathematics does not change that.” makes no sense to me. The models are assembled using mathematical physics. The finite difference equations on the mesh are constructed to conform to the equations of the underlying physics.
Apart from being so bloody rude to everyone and writing in bold italics, do you actually:
1) Have any expertise?
2) Have a point?

September 15, 2013 1:26 pm

Awaiting moderation? Have I offended?

cd
September 15, 2013 1:26 pm

Richard
I don’t care whether you watch it or not.
Please state the review to which you are referring.
How do you propose I “state the review”? Leaving aside that the request is grammatically incorrect, I never stated he gave a review; I said there was a good video where he pointed out why GCMs are limited.
I’ll assume that your rather pompous, terse and indifferent response was due to your – apparent – poor verbal reasoning.

Jeff L
September 15, 2013 1:31 pm

So, if this hypothesis is correct, the real question is: what controls the level(s) of the governor?
… it could also be that this branch of study hasn’t progressed simply because there is way too much politics being done & not enough science

RC Saumarez
September 15, 2013 1:31 pm

@ Willis Eschenbch.
http://wattsupwiththat.com/?s=saumarez
I rather enjoyed reading your posts about the UK.
However, as you said in a comment on my post above, you are a self-taught mathematician.
If you put your head above the parapet and start pontificating on subjects that you know very little about, you should not be surprised when individuals who have more experience than yourself call you out.
R

September 15, 2013 1:33 pm

RC Saumarez:
At September 15, 2013 at 1:25 pm you ask if I have any “expertise”. Well, I clearly have much more than you but – as this thread shows – that is not saying much.
Richard

September 15, 2013 1:38 pm

cd:
Your post at September 15, 2013 at 1:26 pm is boorish.
You said there was a review and I asked you what it was.
In another post you provided a 52 minute video and addressed it to me with no other information.
As to my grammar, you provided no grammar because you said nothing.
As I said, if you have a point to make then make it.
Richard

RC Saumarez
September 15, 2013 1:38 pm

@Richard Courtney
Oh really? Please Justify. Degrees? Publications?

cd
September 15, 2013 1:44 pm

RC Saumarez
I wouldn’t bother with him; he is either a troll or just too pompous to converse. I guess monologue is how he usually communicates.
I agree with many of the points you make. I think you should, if you’re interested, look at the video I linked to in one of my comments to Richard. It is an excellent presentation by Dr C. Essex and highlights probably the most fundamental issues with the current batch of GCMs, issues that cannot be overcome by ever more efficient code or programming paradigms. As far as I know, GCM modellers are at the forefront of parallel programming for sequential algorithms, but when the “quality bottleneck” is hardware you’re fighting the wrong war. The main issues are resolution, and up-scaling manifolds in order to solve the Navier-Stokes equations. But I was shocked to see how they deal with numerical instabilities. I can’t believe all these leaps in the design and implementation of code were made without looking at these fundamental issues; without addressing those outlined by Essex they’re always going to be crap!

September 15, 2013 1:45 pm

RC Saumarez:
This thread is not about me and I see no reason to answer any personal questions, especially when you could have had an answer by reading references mentioned in the thread.
And that is my last post addressed to you. I despise trolls.
Richard

RC Saumarez
September 15, 2013 1:45 pm

@Richard S Courtney.
Are you the Richard S Courtney whom Desmogblog is rather rude about?
DPhil Cambridge? Does Cambridge grant a DPhil?
If I have mis-identified you, please make this clear, because having scanned Richard S Courtney’s publications on the web, I cannot understand how you can claim to be an authority on mathematical modelling.
If you are not this Richard S Courtney, I offer my humble apologies.

September 15, 2013 1:48 pm

cd:
I see you have resorted to the usual anonymous troll tactic of poisoning the well.
And tag team trolling, too.
Richard

RC Saumarez
September 15, 2013 1:50 pm

To Anthony Watts,
cc: Willis Eschenbach.
I am very happy to write a post to discuss model identification. You will remember that when I thought W’s post on signal processing lacked a certain understanding of the subject, he told me to put up or shut up.
You invited me to write a post, which I did. I went out of my way to be inoffensive in the post, but a number of readers with a signal processing background got the point.
If you would like me to do so again, I would of course be delighted to do so.

cd
September 15, 2013 1:58 pm

RC Saumarez
Desmogblog
My opinion of you has just crashed. Anyone who gets their opinions from, or seeks information about anybody on, such a poisonous website deserves disdain, for they seek it for others.

Pamela Gray
September 15, 2013 2:05 pm

Richard, you said, “You need to look up finite difference analysis (FDA) and finite element analysis (FEA).” These types of mathematical processes do form part of the code strings in climate models. Is that bad or good in your estimation?

RC Saumarez
September 15, 2013 2:08 pm

Here are some of Eschenbach’s responses during this thread
Actually, the problem with fusion is not the science—it’s the engineering. The climate question is purely theoretical. But the fusion challenge is entirely physical. We’re trying to cage the sun in a bottle … and as a result, the lack of progress there is quite different from the lack of progress in narrowing the bounds on climate sensitivity.
w.
For heavens sake, Pamela … are you drunk-blogging or something? Your fantasies about what I want are a joke—they have nothing at all to do with me. You’ve totally misunderstood my post, you still don’t seem to have grasped what my model does, and now you are simply wallowing in your sick ideas about what I want and who I am. I’m not who you think I am, not by a thousand miles.
I won’t hold it against you, but for goodness sakes, next time leave all of that kind of pathetic, obsessive personal stuff out entirely and confine yourself to the science. You’re just embarrassing yourself with that puerile nonsense.
w.
Pamela, when you respond to me and say “you must be” this and “you want” that, that’s not a “vignette”.
That’s an accusation about me, and in this case a very ugly and unpleasant accusation that had nothing to do with me.
So I’m sorry, but your “explanation” doesn’t hold water. An apology is in order, not a justification of your unwarranted attack.
w.
PS—Citing an entire Chapter of an IPCC report? Is that your idea of a proper citation? My high-school chemistry teacher would have thumped me with her red pencil if I tried that nonsense. If you have a point you wish to back up, you need to cite chapter and verse.
As it stands, you’re no better than the Bible-thumpers of my childhood, who would stand up in the tent and when someone asked a question would hold up the Bible and shout “The answer’s right here” … perhaps the answer is somewhere in the entire chapter you just cited, but I’m not going to try to guess just which paragraph you’re talking about.
RC Saumarez says:
September 15, 2013 at 10:24 am
Having revisited the “cold equations” , I have to agree with Pamela Gray.
Here we go again with vague claims … you have to agree with Pamela Gray saying WHAT?
All your equations are is a linearisation of the SB effect, which results in a simple ARMA difference equation. This can roughly emulate temperature, as is well known and has been pointed out by many workers, including Mann and McIntyre.
The point of a GCM is that it encapsulates mechanisms. If A happens the B will follow that will precipitate C and so on. This means that in such a model parameters that relate to details of climate can be examined for their effects in the future (Note: I am not saying that they do this with any particular accuracy).
You (and Pamela) seem to have totally misunderstood the idea of a “black box analysis”. Here are three posts that will give you a better idea what I’m talking about.
Zero Point Three Times the Forcing
Life is Like a Black Box of Chocolates
Climate Model Sensitivity Calculated Directly
Therefore I think you are wrong in your assertion that your “model” performs as well as GCM. Your model isn’t a model – it is a curve fitting exercise, while GCMs attempt to capture mechanisms.
Ummm … come back after you’ve read the three posts, and you’ve understood the idea of a “black box analysis”. At present, we’re talking on entirely different levels, and that doesn’t work.
As an aside, if you are dealing with distributions of variables, as you discuss in your “Cold Equations”, it seems to me that your mathematics does not capture this.
I’m sorry, but could you be a bit more vague? That’s almost meaningful …
w.
Your specious claim is that in the first paragraph the “you” clearly refers to me, but in the second paragraph it doesn’t?
Hogwash.
Clearly, you are referring to me all the way through, and now, rather than apologize as a decent human would, you’re trying to weasel out of it with that bogus excuse?
You don’t seem to get it, and yes, I do mean “you”. Here’s a protip—that’s why people use the word “you”, because they mean the person they are addressing. Otherwise they say “him”, or “someone”, or “her”, because “you” means … well … you.
You have insulted me, whether deliberately or not, and now you want to justify it with some bogus excuse that it would have been hard to phrase your “vignette” so that it wouldn’t be insulting … really?
PS—if you think that a “black box model” is just a set of transfer functions, you’ve missed the purpose of the black box analysis entirely. The purpose is to understand what the system in question is doing functionally. In this case, my analysis showed that the models are simply lagging and resizing the inputs. This is an important finding, as it builds on and completely explains Kiehl’s observation that models with high forcings have low sensitivities.
And all of your handwaving about transfer functions and differential equations and how wrong I am doesn’t change those findings at all. RC, you haven’t shown that one single thing that I found in my black box analysis of the models is wrong … and that seems to be bothering you a lot, if your agitation is any gauge.
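For readers who want to see what “lagging and resizing the inputs” amounts to in practice, here is a minimal sketch. This is not Willis’s actual code, and the sensitivity `lam` and time constant `tau` are illustrative numbers only: an exponential lag applied to a forcing series, rescaled by a constant.

```python
import numpy as np

def lag_and_resize(forcing, lam, tau):
    """Emulate a model's global temperature output as the forcing
    series, exponentially lagged (time constant tau, in steps) and
    rescaled by a sensitivity lam -- the 'black box' form at issue."""
    out = np.zeros_like(forcing, dtype=float)
    alpha = 1.0 - np.exp(-1.0 / tau)  # one-step relaxation fraction
    for i in range(1, len(forcing)):
        # each step, temperature relaxes toward lam * current forcing
        out[i] = out[i - 1] + alpha * (lam * forcing[i] - out[i - 1])
    return out

# a step in forcing produces a lagged, rescaled temperature response
step = np.concatenate([np.zeros(10), np.ones(90)])  # W/m^2, illustrative
t = lag_and_resize(step, lam=0.5, tau=15.0)
```

The point of the sketch is only that two free numbers (a lag and a scale) suffice to reproduce this kind of smooth, delayed response.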
I realise that you have free rein on this blog to post anything you like, and that you have special license to reply to anybody who displeases you.
Nevertheless, there are people who comment on this blog who have considerably more scientific education and experience than your good self.
I am unable to understand what makes you think that you have any authority to make the pronouncements that you do.

September 15, 2013 2:13 pm

RC Saumarez says:
September 15, 2013 at 1:50 pm
You invited me to write a post, which I did. I went out of my way to be inoffensive in the post, but a number of readers with a signal processing background got the point.

If you have to go out of your way to be inoffensive, then you have a problem.

RC Saumarez
September 15, 2013 2:18 pm

@CD,
I am not saying that Desmogblog’s opinion is worth having.
I am trying to identify Richard S Courtney. I asked him if he had any degrees or any publications that would back up his assertions. If he is the one that is pinpointed there, one can find his papers. Those that I can find do not point to any particular expertise in the solution of partial differential equations.
As I said, if he is the wrong Richard S Courtney, I apologise profusely. He is at perfect liberty to refute this.

Luther Wu
September 15, 2013 2:19 pm

rogerknights says:
September 15, 2013 at 1:24 pm
Pamela Gray says:
September 15, 2013 at 12:09 pm
Hey Joe! The same thing [a spoken ad starting automatically] has been happening to me! Twice this morning! It just started happening today on this thread. Weird.
Me too. I don’t think it’s a browser problem. I think WordPress has fiddled with something.
__________________
Yes, something.

cd
September 15, 2013 2:19 pm

RC Saumarez says:
September 15, 2013 at 2:08 pm
Where are you going with all this? You seem a little disgruntled, to say the least, and seem more interested at times in discrediting the man rather than his arguments.
1) Is it because you feel entitled to peoples’ agreement?
2) That the given review is wrong? Is the premise wrong or are the arguments wrong?
3) You’re here to sabotage the thread?

cd
September 15, 2013 2:23 pm

RC Saumarez
You have no right to demand that anyone disclose who they are. If you have done so, then more fool you.
BTW, qualifications have nothing to do with it. I have ample qualifications to argue on most scientific issues, but that doesn’t make me right – it’s my arguments that do. In short, what do his qualifications, or who he is, have to do with anything? He’s made his case; that is all you need here!

Luther Wu
September 15, 2013 2:27 pm

@ RC Saumarez,
I’m really unsure of your point. If Willis demonstrated, quite some time ago, that the models’ outputs reduce to a simple ‘black box’ equation, then either you see that his analysis is valid, or you disagree and can prove it.

RC Saumarez
September 15, 2013 2:28 pm

@Jeff Alberts.
I suggest you read what W has posted in response to people.
The background to this dispute is that I have a PhD from a signal processing lab and can distinguish something that is sensible from something that is not.
W produced a rather strange post on filtering and correlation and I commented on it. I received a diatribe of the nature shown in my post above, in which I was told to put up or shut up. I was invited to write a post, which I did. In this I was careful not to throw stones at Mr Eschenbach but to be objective and informative.
My point is this.
WUWT is a premier science blog and some very good people write posts on it. Mr Eschenbach writes extensively on mathematical modelling, signal processing and many other topics. Those of us who have expertise in these fields make some critical comments that would be regarded as polite, but effective, in academia, and we are met with a volley of abuse.
My personal opinion as a PhD with some experience is that Eschenbach’s posts on modelling and signal processing are naive. This stems from a lack of training and experience in the subjects that he posts on.

September 15, 2013 2:28 pm

Pamela Gray:
I said you had crossed the line, and I had had enough of answering you. But in the light of the RC Saumarez troll’s attacks I suppose I must make some response to your post at September 15, 2013 at 2:05 pm which asks me

Richard, you said, “You need to look up finite difference analysis (FDA) and finite element analysis (FEA).” These types of mathematical processes do form part of the code strings in climate models. Is that bad or good in your estimation?

Oh dear, that is not even wrong
http://en.wikipedia.org/wiki/Not_even_wrong
The model is constructed as a framework and iterates to stability at its nodes.
As I said, you need to look up FDA to understand what the models are doing.
The code determines what occurs in the model.
FDA is neither good nor bad. There was a time when I used FEA (a modelling method similar to FDA) for stress analysis. It is a method. Formulate a good model and it enables analyses of surprising accuracy, precision and reliability. Do it wrong and the result is either nonsense or fails to achieve stability.
So your question as to whether the models’ use of FDA is good or bad does not have an answer (it is like asking the name of the Pope’s wife), because it depends on the model, how it was formulated, and the processes coded into it. If a process is incorrectly formulated or is not included, then it cannot be known whether the effect of that is significant or not, because nobody can know what the model actually does in achieving stability through its iterations (just as the name cannot be known of a person who is not known to exist).
Richard
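To illustrate what “iterates to stability at its nodes” means, here is a toy finite-difference sketch. It is purely illustrative and not taken from any climate model: Jacobi relaxation of the 1D Laplace equation with fixed boundary values, iterated until the nodal values stop changing.

```python
import numpy as np

def jacobi_steady_state(n=50, tol=1e-6, max_iter=20000):
    """Toy finite-difference example: relax u'' = 0 on a 1D grid with
    fixed boundary values until the nodes stop changing, i.e. the grid
    'iterates to stability at its nodes'."""
    u = np.zeros(n)
    u[0], u[-1] = 0.0, 1.0  # boundary conditions
    for it in range(max_iter):
        u_new = u.copy()
        # each interior node becomes the average of its two neighbours
        u_new[1:-1] = 0.5 * (u[:-2] + u[2:])
        if np.max(np.abs(u_new - u)) < tol:
            return u_new, it
        u = u_new
    raise RuntimeError("failed to reach stability")

u, iters = jacobi_steady_state()  # converges to a linear ramp 0 -> 1
```

A badly formulated scheme (too large a step, wrong stencil) either blows up or never converges, which is the “fails to achieve stability” case Richard describes.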

RC Saumarez
September 15, 2013 2:30 pm

@ Luther Wu.
I would be delighted to do so if I am invited.

September 15, 2013 2:37 pm

cd:
It seems I owe you an apology.
You made a comment which I understood to be support for the behaviour of RC Saumarez.
Your subsequent posts demonstrate that I was mistaken in that understanding.
I completely apologise for my erroneous assumption, resulting accusation, and any offence thus caused.
Richard

RC Saumarez
September 15, 2013 2:38 pm

@Richard S Courtney
It’s been a long time since I’ve used finite element analysis, but I seem to remember that it is based on variational principles, while the finite difference method is based on a Maclaurin/Taylor series.
Please correct me if I’ve misunderstood this.
Your humble troll

RC Saumarez
September 15, 2013 2:40 pm

@ Willis Eschenbach,
If Anthony Watts invites me to write a post, I will do so.

Latitude
September 15, 2013 2:40 pm

Latitude says:
September 15, 2013 at 11:41 am
absolute garbage
======
By refusing to address when something is limiting … people can get away with saying things that are ridiculous.

RC Saumarez
September 15, 2013 2:47 pm

@cd.
I am certainly not here to sabotage anything.
I do not think that the reasoning is correct.
I see no point in continuing with this rather unpleasant thread. I apologise if I have given offense.

cd
September 15, 2013 2:51 pm

richardscourtney says:
September 15, 2013 at 2:37 pm
I actually agree with a lot of RC’s earlier points. I’m not sure where all these sit with the above article. It seems to be going a bit off topic now and has become a bit of a “slagging match”. It’s a pity, as I’m intrigued by GCMs and how they’ve managed to carry the authority of Gospel.

Pamela Gray
September 15, 2013 2:52 pm

A quick review of who I am: I am less than 5 ft tall. Freckled. Almost full-blooded Irish. Long unruly light red hair that I don’t really know what to do with. Solidly built and a pretty good shot. Armchair climate and weather hobbyist. Close to retirement. Love to fish and hunt. One BS and two Masters, plus administrative coursework beyond that. Very few publications and presentations, none on climate science. One was original research on the auditory brainstem response to generated sudden-onset frequency-specific tones presented rapidly, which resulted in an averaged synaptic brainwave signal minus random brainwave noise and was published in a well-known journal. Educator by trade. Currently downsizing to sub teaching so I can hunt and fish whenever I choose. I have no doubt that Richard has more widely spoken on this topic than I have. Willis as well.
That said, I have at times, to myself and rarely in a comment, questioned their tone and mannerly responses to myself or other commentators, and I have more often questioned as well their theories. I believe that is part and parcel of this blog and the owner’s willingness to allow mostly unmoderated comments. I will continue to question science and proposed theories, sometimes being snarky and sometimes being serious, but rarely if ever will I be testy or rude. Testiness and rudeness speaks for itself, draws attention to itself, and judges itself, for better or for worse, without help from me.

September 15, 2013 2:53 pm

Friends:
Now the arrogant troll has withdrawn, I will say he did get a trivial point right about the difference between FDA and FEA (perhaps he did a Google). As I said, they are similar. They are not the same. But the difference is not important to this thread or to what I wrote.
Richard

cd
September 15, 2013 2:57 pm

RC Saumarez says:
September 15, 2013 at 2:47 pm
I see no point in continuing with this rather unpleasant thread. I apologise if I have given offense.
I actually agree with a lot of your earlier points and you certainly didn’t cause me any offense. But I think Willis has a point as some of your posts do tend to lack “reference” to the actual points in the article. Anyway, you seem to be a nice chap but managed to have been drawn into a bit of a conflict – which in fairness is a bit of your own making as these things often are.
I would still encourage you to watch the link to Dr C Essex’s presentation.

RC Saumarez
September 15, 2013 2:57 pm

@cd
Your 1:44 post: I agree with the second part. I won’t comment on the first.

RC Saumarez
September 15, 2013 2:59 pm

@ Richard S Courtney.
I can assure you that I did not google the difference. Read my papers!

September 15, 2013 2:59 pm

cd:
At September 15, 2013 at 2:51 pm you say

I actually agree with a lot of RC’s earlier points

Assuming they are pertinent to the thread, it would be helpful if you were to say what they were.
This because I only noted two points about models that he made (excluding his parting shot) and I refuted both. The second was a change of subject when he was shown to be wrong about the first. After that he turned nasty.
Richard

cd
September 15, 2013 3:05 pm

RC Saumarez says:
September 15, 2013 at 2:57 pm
I won’t comment on the first.
I’m not even sure who I was talking about there – but it wasn’t you. It was addressed to you!

DirkH
September 15, 2013 3:10 pm

Willis Eschenbach says:
September 15, 2013 at 11:37 am
“‘Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.’
Origin of life.
Craig Venter reports that he and his team are closing in on artificial life …”
I know.
So he wants to prove that there was a creator.
Dawkins must already be having hissy fits.

DirkH
September 15, 2013 3:13 pm

Willis, what I was referring to was not the engineering advances of modern biochemistry, but the search for a mechanistic explanation for life from nothing in the absence of a creator, whose existence counts as an invalid deus ex machina trick for the mechanistic scientists.

Pamela Gray
September 15, 2013 3:21 pm

I find this article an interesting one. Richard, it mentions that “Some AGCMs use finite-difference methods…” Is this in disagreement with what you know about models? There are many such articles. Are you saying that models don’t use this method and should? Sorry, I’m just not clear on your beliefs about these mathematical methods as they apply to climate research.
http://kiwi.atmos.colostate.edu/pubs/CISE.pdf
Note: look on the first page, 5th paragraph.

AndyG55
September 15, 2013 3:23 pm

The “governor” is the atmospheric pressure gradient.

Pamela Gray
September 15, 2013 3:24 pm

Willis, in my opinion, the very fact that one step forward and two steps back have been taken is just what should happen, given the disparity between the models and observations. It seems to me that, though too slow for many of us, the scientific process of understanding a complex climate is proceeding as it should?
[Pamela, I’m still waiting for an apology from you. -w.]

Pamela Gray
September 15, 2013 3:26 pm

Folks, it is 99 degrees here in NE Oregon. Me and my big sis (rofl- she is only 5 ft 2 in) are headed for the redneck “pool” outback with cold beer at the ready. So I am off line for a little while.

September 15, 2013 3:29 pm

Pamela Gray:
re your post at September 15, 2013 at 3:21 pm.
I don’t have “beliefs” about the models. I have understandings. There is a big difference.
There are dozens of models and they differ. This – as I explained above – is why it is an error to average their outputs.
Richard

Luther Wu
September 15, 2013 3:39 pm

RC Saumarez says:
September 15, 2013 at 2:28 pm
“…”
______________________
I will go back and see if I can find that post, as I don’t remember seeing your information at the time, thank you.

DirkH
September 15, 2013 3:40 pm

Willis Eschenbach says:
September 15, 2013 at 3:27 pm
“I fear, however, that your proposition that there is an invisible creator is by definition not falsifiable … since no one can falsify the existence of an invisible being.
I mean, that’s what “invisible” means, no way to photograph, measure, weigh, or otherwise perceive such a creature …so being unable to find him/her/it means nothing.””
How is a Higgs boson “found”? By wading through millions of noisy datagrams and then saying: yeah, there’s something around 125 GeV that fits our theory.
Similarly the existence of the invisible being can normally only be inferred. I say ‘normally’ because religious people hold that there are wonders; and epiphanies, which should be measurable as deviations from the mechanistic scenario if they exist and can be observed. That would be the invisible being showing itself through its meddling with its creations.
One interesting thing I found:
Complexity extrapolations estimate the age of life to be twice the age of the planet.
http://en.wikipedia.org/wiki/Origin_of_life#Coenzyme_world
Something that is as yet unexplained.

Jim Clarke
September 15, 2013 3:54 pm

Haven’t read all the comments, but mine is simple. Nail – Head
The point is that the models are entirely based on the initial assumption that feedbacks are positive. Since that is wrong, as the atmosphere is showing us, no amount of computer power, crunch time or intimidation will produce the right answer with that assumption. The feedbacks are governing feedbacks, which steer us away from warming tipping points. The proof of that lies in the fact that the Earth has never, in 4 billion years, experienced runaway global warming. Never!
We knew this from the very beginning.
The IPCC may be lowering their projections, but they are still clinging to the wrong assumptions about feedbacks. Consequently, we still don’t have much to cheer about scientifically.

Luther Wu
September 15, 2013 4:07 pm

Willis Eschenbach says:
September 15, 2013 at 3:27 pm
“I fear, however, that your proposition that there is an invisible creator is by definition not falsifiable … since no one can falsify the existence of an invisible being. I mean, that’s what “invisible” means, no way to photograph, measure, weigh, or otherwise perceive such a creature …so being unable to find him/her/it means nothing… and because of that, you’re way outside the scope of my question…”
_______________________
Willis, I’m commenting on words you just lightly touched upon, not your actual commentary…
It is trendy among many (typically) on the left to criticize other peoples’ ideas of God as an invisible figure, usually referred to as being “in the sky”. Such comments are often repeated as original, everywhere within the ‘progressive’ world online.
I’m not here to discuss God, but rather get in a few words about the devil, who I am convinced does not exist, since I’ve been all over hell, with no sign of ‘im so far.

DirkH
September 15, 2013 4:22 pm

Luther Wu says:
September 15, 2013 at 4:07 pm
“I’m not here to discuss God, but rather get in a few words about the devil, who I am convinced does not exist, since I’ve been all over hell, with no sign of ‘im so far.”
If you examine Anselm Of Canterbury’s / Gödel’s ontological proof, you will, interestingly enough, find that it holds for God but not for the devil.
http://en.wikipedia.org/wiki/G%C3%B6del%27s_ontological_proof
(Gödel’s proof has just been validated for logical consistency by a theorem prover program at the Free University of Berlin; that’s why I stumbled across it.)

TimTheToolMan
September 15, 2013 4:22 pm

RC Saumarez writes “The solution is numerical, which is based on a large body of maths of how you approximate differential equations and solve them iteratively.”
For a single model run, perhaps. As soon as you parameterise a process, you’re enforcing a curve fit, because a parameterised process can’t change without changing the parameter, which, as I said, is a curve fit. What do you think the complex modelled processes converge to when averaged over many runs?
A. The complex equations?
B. The underlying fundamental assumption which in the climate case is that temperature is determined by forcing with a lag?

davidmhoffer
September 15, 2013 4:24 pm

Luther Wu;
I’ve been all over hell, with no sign of ‘im so far.
>>>>>>
Ah well, perhaps s/he is just avoiding you? More importantly, can you confirm if hell is exothermic or endothermic?
http://www.pinetree.net/humor/thermodynamics.html
(this thread needs a bit of humour injected into it in my opinion)

papiertigre
September 15, 2013 4:28 pm

I think it needs to be said: this is wonderful news. The IPCC are regressing.
The question should have been, “Can anyone name any other scientific field that has moved backward in the last third of a century?”

Nick Stokes
September 15, 2013 4:49 pm

TimTheToolMan says: September 15, 2013 at 4:22 pm
“B. The underlying fundamental assumption which in the climate case is that temperature is determined by forcing with a lag?”

Who made that assumption? Where? Again, no quotes.
It’s true that if you average over time, average over space, Newton’s law of cooling tends to emerge. That’s Willis’ “canonical equation”. But the averaging removed a lot of complexity. And if you average over models (as Willis did), a lot more is lost.
Jim Clarke says: September 15, 2013 at 3:54 pm
“The point is that the models are entirely based on the initial assumption that feedbacks are positive.”

Again, no quotes or cites. Models do not make assumptions about the sign of feedback.

TimTheToolMan
September 15, 2013 4:55 pm

Nick writes “Who made that assumption? Where? Again, no quotes.”
Are you asking where in the climate community somebody decided we’ll just model forcing with a lag? Then the answer is that nobody explicitly made that assumption. However, that doesn’t alter the fact that the models, when averaged over many runs, appear to model forcing with a lag.
Are you suggesting the correlation with the forcing-with-lag equation is purely coincidental? Or perhaps simply a fit? Because it’s not entirely a fit, is it? It’s very much in the ballpark of what the models could be simplified to.

bit chilly
September 15, 2013 4:56 pm

i have to say, until this conversation thread i have seen nothing but a group of very smart people applying critical thinking and open minds to others’ comments. as a lay person i appreciate the clear and concise way commentators like richard s courtney, willis, allan mcrae (really enjoyed your informative comments today) and many others explain technical subjects to enable basic understanding for people like myself.
on this comment thread, however, i believe i have seen an example of the worst sort of comment, the type i had thought was reserved for places like sks, by rc saumarez. cd alluded that he may have had a point somewhere, and i was looking forward to a discussion whereby his objections to willis’ reasoning/method would be raised, backed up by his own informed position on the subject, which willis could either refute or use to add something to his knowledge base.
unfortunately all i got was rc saying willis was wrong because he said so, with no highlighting of how or why.
richard s courtney did add some clarity to the subject, which hopefully confirmed my knowledge-light assumption that it was apples and pears being argued as opposed to apples and apples; accept my apologies if i am incorrect, richard s courtney.
as one of a growing band of “new” sceptics i would advise rc saumarez that many ordinary citizens are sick to the back teeth of being “told” what to believe, without any actual facts to back the assertion that those doing the telling are correct.
i believe this is how we have arrived at the whole sorry mess of cAGW we have today: by politicians accepting what they have been told by scientists who refuse to accept the possibility they are wrong.
rc saumarez, you may well have a very valid point; i personally do not know. but if your responses on this discussion thread are anything to go by, i will never know, and more importantly neither will the people you have addressed your critique to. at the very least, when asked to demonstrate how and why, you should have obliged.

John Andrews
September 15, 2013 5:13 pm

Bingo!

Nick Stokes
September 15, 2013 5:14 pm

TimTheToolMan says: September 15, 2013 at 4:55 pm
“However that doesn’t alter the fact that the models when averaged over many runs, appear to model forcing with a lag.
Are you suggesting the correlation with the forcing with lag equation is purely coincidental?”

No, as I said, it’s basically Newton’s Law of cooling (whose ludicrous idea?). Hot things emit more heat. Things that are heated get hotter.
But the Earth, too, globally averaged, responds to forcing with a lag. eg Lucia’s lumpy, or Tamino’s two-box model.
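For readers unfamiliar with the two-box idea, here is a generic sketch. The parameters are illustrative only, not Lucia’s or Tamino’s actual values: a fast surface box coupled to a slow deep-ocean box, which responds to a forcing step with exactly the kind of lag under discussion.

```python
import numpy as np

def two_box(forcing, c1=7.0, c2=100.0, lam=1.2, gamma=0.7, dt=1.0):
    """Generic two-box energy balance sketch: fast mixed-layer box T1
    coupled to a slow deep-ocean box T2.  Parameters are illustrative
    (c in W yr m^-2 K^-1, lam and gamma in W m^-2 K^-1)."""
    t1 = np.zeros_like(forcing, dtype=float)
    t2 = np.zeros_like(forcing, dtype=float)
    for i in range(1, len(forcing)):
        # surface box: forced, radiates lam*T1, leaks heat to deep box
        dT1 = (forcing[i] - lam * t1[i-1] - gamma * (t1[i-1] - t2[i-1])) / c1
        # deep box: warmed only by exchange with the surface box
        dT2 = gamma * (t1[i-1] - t2[i-1]) / c2
        t1[i] = t1[i-1] + dt * dT1
        t2[i] = t2[i-1] + dt * dT2
    return t1, t2

# step forcing of ~3.7 W/m^2 (roughly a CO2 doubling, for illustration)
step = np.concatenate([np.zeros(5), np.full(495, 3.7)])
t1, t2 = two_box(step)
```

The surface box responds within a few years while the deep box drags the full response out over centuries, which is why even this toy system, seen from outside, looks like “forcing with a lag”.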

ed mister jones
September 15, 2013 5:14 pm

thingadonta says:
September 15, 2013 at 6:08 am
“I just watched a documentary on the Concordia cruise line disaster. Despite the latest state of the art technology, the most sophisticated navigation systems, despite a crew of around 1000, the cruise liner crashed into a well known and well charted outcrop of rock marked on any standard chart of the area.
The reason? Entirely human error. The captain took the ship deliberately off course . . .”
IOW, he defeated all of the mechanisms in place to prevent the undesired outcome … like an airplane pilot who ignores (for whatever reason) “Terrain” warnings. I suppose it would be quite a project to enumerate EACH of the scientific method processes and failure-prevention methods that have been deliberately defeated/ignored by those obsessed with selective perception of climate reality.

ed mister jones
September 15, 2013 5:17 pm

Bit Chilly.
A sublimely constructive takedown.

Luther Wu
September 15, 2013 5:18 pm

bit chilly says:
September 15, 2013 at 4:56 pm
“…as one of a growing band of “new” sceptics i would advise rc saumarez that many ordinary citizens are sick to the back teeth of being “told” what to believe, without any actual facts to back the assertion that those doing the telling are correct.”
________________________
Bravo
————————————-
davidmhoffer says:
September 15, 2013 at 4:24 pm
Luther Wu;
I’ve been all over hell, with no sign of ‘im so far.
>>>>>>
Ah well, perhaps s/he is just avoiding you? More importantly, can you confirm if hell is exothermic or endothermic?
________________________
Well, hell’s probably exothermic, since every time you turn around, either hell’s a poppin’ or all hell’s breakin’ loose.
On the other hand, there is also evidence that the opposite is true: http://www.funnysigns.net/hell-freezes-over/

TimTheToolMan
September 15, 2013 5:22 pm

Nick Stokes writes “But the Earth, too, globally averaged, responds to forcing with a lag. eg Lucia’s lumpy, or Tamino’s two-box model.”
And if the sun were putting out more energy then this would be valid reasoning. But it’s not like that, is it? Instead it is theorised that the CO2 reorganises the energy such that the surface is warmer, and that is not valid reasoning without further proof. When the models can be shown to be ignoring all that complexity then there are a couple of options.
A. The earth also ignores all the complexity.
B. The models are wrong.
Given the recent divergence of observed vs measured temperatures I think it’s becoming clear which is correct.

TimTheToolMan
September 15, 2013 5:26 pm

And of course when I say “observed vs measured temperatures” I mean observed vs modelled temperatures.

September 15, 2013 5:28 pm

Willis Eschenbach says: September 15, 2013 at 11:34 am
Or are you just nitpicking?
w.
>>>>>>
Willis, your work is great!! Sorry to nitpick. My bad. And I get it about a self-regulated / self-modulating system. That makes sense. Either the established physics is bull and climate sensitivity is zero or ~ 0, or something else is at play that we don’t understand. But, yes, CO2 is not doing what the warmists say it should be doing. It’s doing… nothing. That’s the evidence.

Paul Monaghan
September 15, 2013 5:49 pm

Willis, love your contributions and always look forward to them. But I am with Bloke Down the Pub regarding the relationship between applied fuel and speed. I may just be missing your point, but when reading the article initially I had the same response as Bloke and went to the comments looking to see if anyone else did, and the Bloke (thanks Bloke!) stated it well.
Cruise control merely does more or less automatically what humans try to do manually when attempting to maintain a given speed regardless of road conditions, in both cases by throttle manipulation. But I’m sure you know that. So I think you were making a different point. Could you please explain more clearly how the relationship between speed and fuel consumption gets “uncoupled” with a governor?
Many thanks,
Paul Monaghan
Connecticut, USA

RC Saumarez
September 15, 2013 5:51 pm

I’m sorry if I have come over as arrogant and didactic.
This was not my intention.
If you look at this post there is, IMHO, an arrogant thread which amounts to bullying. Pamela Gray does not deserve the responses that she has received. In normal academic exchanges, this would be unacceptable.
As regards my position: I spent about 24 hours reading the NCAR CAM 3.0 document. It has been assembled by very high-grade mathematicians and scientists. I got a general idea of their approach. However, I would think that it would take anyone from a standing start at least a year to come to grips with the numerics of the model. Also, there is a huge amount of physical modelling that makes up the underlying mathematics of the model. I freely admit that I cannot come to grips with this modelling. Although I have worked in some aspects of modelling, this is way beyond my competence.
I think that off-the-cuff remarks by the commentariat on mathematical modelling here are absurd.
I have been met with a volley of abuse and been called a troll. Mr Eschenbach, who by his own admission is a self-taught mathematician, has defended his position against reasonable criticism with abuse to most of his critics, rather than making an argument. I am not saying that he is wrong, but if he makes a proposition, he should defend it with logic. I agree I have been somewhat dismissive of his forays into signal processing, but there are a number of people who read this blog who are educated in the subject and do not accept his line of reasoning.
I have made some points about modelling. I freely admit that I am not primarily a mathematical modeller. I am a measurer. However, I used non-linear stress analysis in bone, which was closely related to variational principles, i.e. FEM, during my PhD, and have used finite difference methods to solve highly non-linear PDEs to try to explain experimental observations in cardiac electrophysiology. (I do not claim great originality in this approach.)
Nevertheless I have been told that I am an ignorant troll and been told to go away. I suggest that this is not the case. I have challenged my most vehement critic to tell us what papers he has published in order to establish his expertise in the subject of mathematical modelling.
I questioned his statement that FE methods were equivalent to finite difference methods, because they are based on different mathematical principles. His response was that I was correct but that I must have looked it up on Google, although I have published results using these methods.
I am happy to write a post on why I think that Eschenbach’s thesis can be criticised on several grounds. I have suggested this several times but have not received a response. If I am invited to do so, I will.
I repeat my comment that there are some serious scientists who write posts on this blog, which is possibly why it has its reputation.

Resourceguy
September 15, 2013 5:57 pm

This makes perfect tactical sense for any agency with a keen eye on the funding and prospects for much more: that is, to acknowledge the truth of actual data to some extent but to keep up the rhetoric for the main funding prize at the same time. It may well be the last of its kind if the cooling continues. They will have to light fires under city-based land temperature stations beyond that point and shoot down the satellites.

Paul Monaghan
September 15, 2013 5:58 pm

P.S., funny little twist: I imagine slick conditions could be a “positive feedback” leading to “instability” in the system, à la another post I read (here?) recently, in the sense that increased acceleration leads to tire slippage and lower speeds, calling for additional acceleration, until you’re just careening along the ice at max RPM until the motor throws a rod through the block (which I guess would be the ultimate negative feedback…). This could of course be accounted for by referencing wheel RPMs to speed in the programming, and I imagine it is, but the thought made me smile.
Best,
Paul

September 15, 2013 6:00 pm

I’m still reading the comments after having read the comments on a very similar thread at Bishop Hill. I am surprised at the number of people who (still) think the projected warming (climate sensitivity) is a property of the climate models.
Everyone should read Richard Courtney’s excellent and lucid explanations of what the climate models do, or rather what they don’t do.
What follows from this is that it doesn’t matter how much the models are improved (in the sense that the physics is better understood and modelled and the grid resolution improved): they will never produce better climate predictions, because the (deterministic) models don’t materially contribute to the predictions, and it is deeply dishonest of the climate modellers et al. to pretend that better models (more money spent) will produce better predictions.

Adam
September 15, 2013 6:00 pm

Similar advances have been made in the field of random noise prediction.

Greg
September 15, 2013 6:01 pm

Thousands of years ago, it was important to know whether the next harvest was going to succeed, especially after years of drought. The pharaoh relied on the high priests for advice. They would slaughter chickens and examine their entrails to determine whether the next harvest would be a success. However, the pharaoh was displeased at the lack of success of their predictions. He challenged the priests, saying the chickens couldn’t predict anything and had no forecasting skill. The priests, fearing that their jobs and heads were on the line, replied, “Not so, sire. What we really need is more chickens.”

Gene L
September 15, 2013 6:21 pm

It would seem to me that the models would be more convincing if they were not focused on global average temps. I can barely accept that this can be determined with any degree of accuracy today, much less 100 years from now. If their models are so wonderful, shouldn’t they be able to publish a forecast for the daily temps at O’Hare, or Sydney, for the next hundred years? We would know how trustworthy the model is in a few months. No need to burden our great-grandchildren. Think how beneficial it would be to society if I knew whether I should buy a new snowblower in November rather than waiting till January. The best NOAA, with billions of dollars of compute power, can do is above average, normal, or below average, and only a few months out at that. And I haven’t found that to be especially helpful.
http://www.cpc.ncep.noaa.gov/products/predictions/long_range/lead01/off01_temp.gif
The AGW models must be accurate to hundredths of a degree if they can forecast average global temps to a tenth. Their own diagrams show they model thousands of points around the globe. Surely they must feel some of these forecasts are actually accurate.
So what’s the temp for 1/1/14 at O’Hare? Is that too much to ask?

Michael
September 15, 2013 6:23 pm

DirkH says:
September 15, 2013 at 11:31 am
“Can anyone name any other scientific field that has made so little progress in the last third of a century? Anyone? Because I can’t.”
On a human scale, climate science has made enormous jumps; on an Earth scale, it’s a pittance.
The geosciences have also made pitiful progress: still only barely penetrating the crust in a small area. Trivial progress, much smaller than climate science. Our understanding of the Earth is trivial; anyone who understands remote sensing knows it is totally dependent on “knowing” what you are measuring in order to measure it.
Astronomy and cosmology: progress like an atom in the entire Earth.

commieBob
September 15, 2013 6:24 pm

Willis, I seldom disagree with anything you say. That said, your cruise control analogy isn’t terribly good.

The addition of a governor completely wipes out that linear relationship, de-coupling the changes in gas consumption from the speed changes entirely.

All other things being equal, if I set the cruise control at 50 mph I will get 30 mpg, if I set it at 70 mph I will get 20 mpg. If I change the speed, the gas mileage changes.

A governor is quite different from a feedback. A governor uses feedback to control a system … but the governor itself is not a feedback.

The system here is a car operating in a certain environment. The input is the desired speed. The output is the actual speed. The cruise control takes the desired speed, subtracts the actual speed and generates an error signal which actuates the throttle. In other words, the cruise control provides the signal that controls the throttle position.
The feedback signal is the vehicle’s speed. The feedback loop consists of the speed sensor and the cruise control. http://en.wikipedia.org/wiki/Control_theory#An_example The whole process is called feedback. http://en.wikipedia.org/wiki/Feedback
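The loop commieBob describes (desired speed minus actual speed gives an error signal, which drives the throttle) can be sketched in a few lines of Python. This is a toy simulation with made-up numbers (gain, drag, time step are all my own illustrative choices, not anything from the comment):

```python
# Toy cruise-control loop: the feedback signal is the measured
# speed, and the controller turns the error (desired - actual)
# into a throttle command. All constants are invented for
# illustration only.
def simulate(desired=70.0, steps=200, dt=0.1):
    speed = 50.0   # starting speed, mph
    gain = 0.5     # proportional controller gain
    drag = 0.05    # crude speed-proportional resistance
    for _ in range(steps):
        error = desired - speed        # error signal
        throttle = gain * error        # controller output
        speed += (throttle - drag * speed) * dt
    return speed

print(round(simulate(), 1))  # ~63.6
```

Note that this proportional-only controller settles slightly below the set point; that steady-state error is a well-known property of P-only control, and real cruise controls add an integral term to remove it. The point the sketch makes is commieBob's: the governor is the `simulate` loop as a whole, while the feedback is the measured `speed` fed back into the error calculation.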

Latitude
September 15, 2013 6:24 pm

Philip Bradley says:
September 15, 2013 at 6:00 pm
====
100%

mike
September 15, 2013 6:25 pm

Pamela,
You know, I really like that “vignette” deal of yours. Good stuff!, despite all the criticisms you’ve received on this blog. In fact, I’m so enthused about the whole thing that I’ve even tried my own hand at a “vignette”. Not as good as yours though:
MY FIRST VIGNETTE
“It’s like you’re a self-absorbed, preening, heroine-of-your-own-story, slightly wearisome scold who specializes in doofus, pompous, school-marmish, baroque put-downs, ostentatiously free of the slightest intemperance of language–a Pecksniff, authoritarian, bully style perfectly suited to crushing any school-kid charge who might exhibit the slightest independence of thought (not that you’d ever do such a thing!). But a “trick” that doesn’t work so well when played on mature adults with abundant life-experience who can “sass” back with impunity. That, and I don’t really believe your speech, except when you’re putting on a little act, is really as “salt-free” as you make out.”
So what did you think of my first try at a vignette, Pamela? And, of course, the “you” in my vignette is not a “you”-you. I mean, like, I’m not talking about you, Pamela Gray, specifically, or anything. Rather, it’s a generalized “you”–you know, like in your own up-thread vignette and all.

Luther Wu
September 15, 2013 6:33 pm

“I am happy to write a post on why I think that Eschenbachs’ thesis can be critisised on several grounds. I have suggested this several times but have bit received a response. If I am invited to do so, I will.”
____________________
Talk is cheap. Catch up.
I’ve asked, Willis has asked, fairly sure others have leaned in that direction. WTH. Are you shooting for an invite or an incite…

Pamela Gray
September 15, 2013 6:39 pm

No harm done to me I assure all. I am very much a learner and not an expert so have no thesis to defend. And as such I am very capable of looking past emotional responses to get to the science, even if I have to wear boots. The science intrigues me and I will leave no stone unturned to examine claims made here as well as in journals.
As to the bullying, it speaks for itself and I offer no comment on it. Meanwhile I continue my studies of climate modeling and am encouraged by the tinkerers out there who are engaged in this practice and writing articles about it in peer reviewed journals.
Here is a hint about searching behind paywalls for articles. If you cut and paste the name of the article, immediately followed by lowercase pdf, into your search engine’s window, you can often find the researcher’s personal copy posted publicly.

davidmhoffer
September 15, 2013 7:09 pm

Nick Stokes;
Models do not make assumptions about the sign of feedback.
>>>>>>>>>>>>>>>>>
Yet they all seem to produce sensitivity estimates greater than the theoretical direct effects of CO2 increases. That being the case, there can be no other logical conclusion than the models are predicated on assumptions of positive feedback.
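The arithmetic behind this conclusion can be sketched with the standard feedback relation ECS = ECS0 / (1 − f), where ECS0 is the no-feedback sensitivity and f is the net feedback factor. The value ECS0 ≈ 1.1 °C for a CO2 doubling is my assumption here (a commonly quoted no-feedback figure), not a number from the comment:

```python
# Net feedback factor f implied by a given equilibrium climate
# sensitivity (ECS), solving ECS = ECS0 / (1 - f) for f.
# ECS0 ~= 1.1 C per doubling is an assumed no-feedback value.
ECS0 = 1.1

def implied_feedback(ecs, ecs0=ECS0):
    """Solve ECS = ECS0 / (1 - f) for f."""
    return 1.0 - ecs0 / ecs

# Across the whole IPCC range of 1.5 to 4.5 C, f comes out positive.
for ecs in (1.5, 3.0, 4.5):
    print(ecs, round(implied_feedback(ecs), 2))
```

Under that assumed ECS0, every sensitivity in the 1.5–4.5 °C range implies f > 0, which is the point davidmhoffer is making: sensitivities above the direct CO2 effect require net positive feedback.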

Nick Stokes
September 15, 2013 7:09 pm

TimTheToolMan says: September 15, 2013 at 5:22 pm
” When the models can be shown to be ignoring all that complexity then there are a couple of options.
A. The earth also ignores all the complexity.
B. The models are wrong.”
Nobody has shown that GCMs are ignoring complexity. Here is just one example of the complexity they are not ignoring.
What has been shown is that if you globally average, and then average over long time periods, a simpler pattern emerges. So it does with Earth data. That doesn’t mean that the Earth (or the GCMs) are ignoring complexity. It means you are. For good measure, Willis averages over models too.
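The averaging point is easy to demonstrate with purely synthetic data (the trend and noise levels below are invented for illustration, not drawn from any model or observation): a signal that looks wildly complex point by point looks simple once block-averaged.

```python
# A "complex" signal (small trend buried in heavy noise) looks
# simple after block averaging. All numbers are synthetic.
import random

random.seed(0)
trend = [0.01 * t for t in range(1000)]
noisy = [x + random.gauss(0, 2.0) for x in trend]

def block_mean(xs, size):
    """Average consecutive blocks of `size` samples."""
    return [sum(xs[i:i + size]) / size for i in range(0, len(xs), size)]

smoothed = block_mean(noisy, 100)  # decadal-style averaging
print([round(x, 1) for x in smoothed])  # a near-linear rise emerges
```

The block means recover the underlying linear rise because the noise averages toward zero while the trend does not; neither the generator nor the averaged output tells you which is true of the fine-scale behaviour, which is the substance of the disagreement here.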

Alex Heyworth
September 15, 2013 7:11 pm

R C Saumarez wrote
I am happy to write a post on why I think that Eschenbachs’ thesis can be critisised (sic) on several grounds.
Do it. Then click on the “Submit Story” tab at the top of the page.

KevinM
September 15, 2013 7:12 pm

Richard Courtney,
If you’re still reading, please consider winding your ALL CAPS into a ball and stuffing…
You’ve responded to my two-sentence plain text comment with twenty sentences, all caps, bold text, exclamation points, and the incorrect assumption that I believe CAGW, expressed with a pejorative term implying my political leaning is opposite what it is.
You sir are a jacka…

davidmhoffer
September 15, 2013 7:15 pm

RC Saumarez;
I am happy to write a post on why I think that Eschenbachs’ thesis can be critisised on several grounds. I have suggested this several times but have bit received a response. If I am invited to do so, I will.
>>>>>>>>>>>>>>>
I for one would like to see your reasoning, so I have to ask, what’s stopping you? Write it up, paste it into the box at the bottom of the screen, click “post comment”. You’ve been invited, in fact challenged, to do so. Many stellar articles by Robert G Brown for example started out as comments in a thread and were elevated to posts.
Seriously, I’d like to see what you have to say on this issue. I’m waiting for you to say it, and there is nothing stopping you from doing so.

JPeden
September 15, 2013 7:17 pm

“bit chilly says:
September 15, 2013 at 4:56 pm”
Yup, I’m not sure why it has suddenly become all the rage to try to sound like a “mainstream” Climate Scientist, i.e., to anoint yourself with omniscience, then try to sound as crazy as possible, meanwhile trying to get the rest of us to obey your every command. You’d think that their omniscience would help them at least try to tell us the answer!

Brian H
September 15, 2013 7:26 pm

willis;
The Essex video is a response by a [theoretician], a mathematician who was in on the early days of climate modelling and was gobsmacked by the outrageous assumptions, claims, and procedures employed, and has had no reason to change his verdict: “You can’t do that!” He analogizes the modellers with the Red Queen, who, with practice, was occasionally able to “believe six impossible things before breakfast.”
He gives slide graphic examples of the real mathematical behaviour of numerical models which are claimed to empirically emulate solutions to closure problems, the Navier-Stokes equations, etc., which mis-attribute stable outputs (due to their inherent limitations) to real world processes.
He is confident (Q&A response) that the current round of determined delusion has about run its course, but certain that human determination to believe some form of nonsense will swiftly replace it. He makes passing reference to the many whose careers and livelihood depend on perpetuating the present “rabbit hole” digging, and will not be deflected to the surface this side of the grave. I suppose a period of competing hare-holes will occur, with a few of the current diggers switching work crews.
cd; forgive my efforts at a précis. This is about my third time through the talk, over many months, and the above is what stuck.

TimTheToolMan
September 15, 2013 7:28 pm

Nick writes “Nobody has shown that GCM’s are ignoring complexity. Here is just one example of the complexity they are not ignoring.”
You think a video of low-resolution fluid flow makes your point? They are ignoring complexity in a number of key areas, Nick. For example, cloud formation is determined by the parameterisation for water vapour saturation and is tweaked to best represent history and keep the model within stable, realistic boundaries. It’s a fudge.
So if cloud formation in the real world changes over time, how is this represented in the model? It’s not.
And if cloud creation isn’t enough, I’d bet that many emergent properties of the climate are similarly parameterised.
For the record, I think Willis is off on an irrelevant tangent when he emphasises the governor aspect of his theory. I think it’s much more important to understand that the climate is composed of emergent properties that affect energy flows, and that climate models don’t represent those emergent properties properly and so can’t represent changes to them either.

TalentKeyHole Mole
September 15, 2013 7:29 pm

Oh Dear.
[Apologies to Woody Guthrie and Johnny Cash.]
How many times,
Must a climate computer code fail,
Before, it can tell us .. yesterday?
The answer .. my friend … is blow’n in the wind
The answer .. is blow’n .. in the wind.
😉

Brian H
September 15, 2013 7:30 pm

edit: theoreticist → theoretician

TimTheToolMan
September 15, 2013 7:49 pm

I wrote “I think Willis is off on an irrelevant tangent when he emphasises the governor aspect of his theory.”
Actually, that’s harsh. I think the governor aspect is important because, IMO, the climatic processes will respond to any surface forcing so as to minimise the surface warming. But having said that, it’s a step-by-step process, and the first step is to describe where the models are going wrong and why.

pyromancer76
September 15, 2013 8:18 pm

Ridicule is so satisfying. Thanks for showing the truth.

September 15, 2013 8:21 pm

Willis, your writing reminds me of why I always loved reading SciAm magazines. The best of those articles always reminded me of poetry. I really enjoy your clear thinking and I thank you for sharing it for all to experience.

David Riser
September 15, 2013 8:31 pm

For those of you wanting to educate yourselves about General Circulation Models: the specific forecasting models in use today are listed at the National Hurricane Center’s (NHC) website. They give a rundown of the characteristics of the major forecasting models, including the “numerical” or dynamical General Circulation Models such as the GFS, UKMET, NOGAPS, CMC, and ECMWF.
The interesting thing is that Willis’s model would be considered a statistical one; the NHC uses statistical models to determine whether a dynamical or numerical model is skillful. So it follows that the folks who know most about models use a similar system to determine accuracy of prediction. The current set of models is still many orders of magnitude away from being accurate for more than a few days, which is actually pretty impressive considering what the models produce.
The models use the current state of the earth for the entry parameters and then are run for a set period of time; every so often a snapshot of the earth is pulled out (every 3, 6, or 12 hours, depending on purpose). This allows us to see what is going on globally. Unfortunately, after about 48 hours they start getting a bit haywire, and after 120 hours they are probably nearly useless. But for the time they work, they have saved quite a few lives in terms of accurate weather forecasting. So I say don’t throw the baby out with the bathwater: we need good models, they just don’t do well in forecasting climate.
The link is below, along with a short description pulled out of that link. Also, if you’re wondering whether these are really the models the IPCC is talking about, the second link will answer that question.
http://www.nhc.noaa.gov/pdf/model_summary_20090724.pdf
http://www.ipcc-data.org/guidelines/pages/gcm_guide.html
“Forecast models vary tremendously in structure and complexity. They can be simple enough to run in a few seconds on an ordinary computer, or complex enough to require a number of hours on a supercomputer. Dynamical models, also known as numerical models, are the most complex and use high-speed computers to solve the physical equations of motion governing the atmosphere. Statistical models, in contrast, do not explicitly consider the physics of the atmosphere but instead are based on historical relationships between storm behavior and storm-specific details such as location and date. Statistical-dynamical models blend both dynamical and statistical techniques by making a forecast based on established historical relationships between storm behavior and atmospheric variables provided by dynamical models.”

David Riser
September 15, 2013 8:45 pm

Oh, and for the person who was saying that we don’t have 1,000 hp for a VTOL car, check out this link:
http://mashable.com/2013/08/28/terrafugia/
They have a pretty nice setup that isn’t VTOL, which is a power-hungry way of going at the problem. We have the power plants for a VTOL car, but there isn’t the will to make one. You can get a lightweight 1,400 hp turbine, and if you don’t need that much power you can get them smaller. The 1,400 hp turbine was used in an F1 race car. Jaguar did a hybrid with turbines at 700+ hp that got great gas mileage to boot. I still don’t think you’ll see a flying car any time soon, as it’s just too difficult politically.
v/r,
David Riser

Geoff Sherrington
September 15, 2013 8:46 pm

Nick Stokes, re weight of CO2 in atmosphere:
“The CO2 arithmetic is simple – 500,000 * 400 ppmv, and then a molecular weight calc. Emissions were here; 32.578645 GTons in 2011, if you like a lot of decimals. Burning C costs money, so these numbers come from accountants, not scientists.”
Does not this calculation assume that the number of molecules of CO2 is fixed in proportion to the number of molecules of air at a point?
If so, what causes the heavy gas CO2 to be dragged up to the rarefied atmosphere around the tropopause?
I’m thinking that the total weight of CO2 in the atmosphere is not simple to calculate, nor is the man-made portion in a dynamic system.
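For what it’s worth, the "molecular weight calc" Nick refers to can be sketched as follows. The constants here are standard textbook values I am supplying myself (total atmospheric mass ≈ 5.15 × 10^18 kg, mean molar mass of dry air ≈ 28.97 g/mol, CO2 ≈ 44.01 g/mol), not figures from his comment, and the sketch assumes exactly the well-mixed-atmosphere premise that Geoff is questioning:

```python
# Rough mass of atmospheric CO2 from the ppmv figure, assuming
# CO2 is well mixed (the assumption being questioned above).
# All constants are standard reference values, not from the thread.
M_ATM_KG = 5.15e18   # total mass of the atmosphere, kg
PPMV_CO2 = 400       # CO2 concentration by volume (mole fraction)
M_AIR = 28.97        # mean molar mass of dry air, g/mol
M_CO2 = 44.01        # molar mass of CO2, g/mol

# ppmv is a mole (volume) fraction, so convert to a mass fraction
# via the ratio of molar masses.
mass_co2_kg = M_ATM_KG * PPMV_CO2 * 1e-6 * (M_CO2 / M_AIR)
mass_co2_gt = mass_co2_kg / 1e12  # 1 Gt = 1e12 kg
print(round(mass_co2_gt))  # roughly 3100 Gt of CO2
```

The molar-mass ratio is what converts a volume fraction into a mass fraction; the well-mixed assumption holds well below the turbopause (~100 km), where turbulent mixing dominates over gravitational settling, which is the standard answer to the "heavy gas" question.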