Believing In Six Impossible Things Before Breakfast, And Climate Models

Dr. Chris Essex: Why Computers Cannot Reproduce The Climate, Never Mind Predicting Its Future

A GWPF talk by Dr Christopher Essex – Chairman, Permanent Monitoring Panel on Climate, World Federation of Scientists, and Professor and Associate Chair, Department of Applied Mathematics, University of Western Ontario (Canada), in London, 12 February 2015

Has the scientific problem of climate been solved in terms of basic physics and mathematics? No, but you will be forgiven if you thought otherwise. For decades, the most rigorous treatments of climate have been done through climate models. The clever model pioneers understood many of their inherent limitations, but tried to persevere nonetheless. Today, few academics are even aware of what the pioneers understood, let alone what has been learned since about the full depth of modelling difficulties.

Meanwhile popular expressions of the scientific technicalities are largely superficial, defective, comically nonsensical, and virtually uncorrectable. All of the best physics and all of the best computer models cannot put this Humpty Dumpty together, because we face some of the most fundamental problems of modern science in climate, but hardly know it. If you think you want to have a go at those problems, there are at least a couple million dollars in prizes in it, not to mention a Fields Medal or two.

But even if you don’t have some spare afternoons to solve problems that have stymied the best minds in history, this talk will cure computer cachet even for laymen, putting climate models into their proper perspective.

M Simon
February 20, 2015 7:39 am

Edward Lorenz in 1972 discussed the issue of modeling. He referred to the “Butterfly Effect”. Some 40+ years later and we need to relearn it? Figures.

Hugh
Reply to  M Simon
February 20, 2015 9:09 am

The butterfly effect as such is totally unimportant. The climate is about average, not about predicting a certain state in the future based on the boundary conditions at the start. The effect you should mention is the uncertainty in parameterization, which prevents calculating an average that you could trust.
Models can handle the butterfly effect with respect to the given equations, but they can’t provide the time-dependent parameter space. Because of that, the butterfly flies back in the form of a scare: maybe the butterfly flaps and the clathrates melt away! Hey, I say, now you are just guessing; back to the drawing board.
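A minimal numerical sketch of the sensitivity being argued about, using the textbook Lorenz-63 system; the parameters, step size, and the 1e-9 perturbation below are illustrative choices, not anything taken from the talk or from a climate model:

```python
# Illustration only: the classic Lorenz-63 system, integrated twice from
# initial states that differ by one part in a billion. The equations and
# parameters (sigma=10, rho=28, beta=8/3) are the standard textbook choice,
# not anything taken from a climate model.
import numpy as np

def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(s, dt):
    k1 = lorenz_rhs(s)
    k2 = lorenz_rhs(s + 0.5 * dt * k1)
    k3 = lorenz_rhs(s + 0.5 * dt * k2)
    k4 = lorenz_rhs(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, steps = 0.01, 4000
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # the "butterfly": a one-part-per-billion nudge

for n in range(steps + 1):
    if n % 500 == 0:
        print(f"t = {n * dt:5.1f}   separation = {np.linalg.norm(a - b):.3e}")
    a, b = rk4_step(a, dt), rk4_step(b, dt)
```

The separation grows until it is as large as the attractor itself, even though the equations are handled exactly the same both times; the argument above is about everything the equations leave out.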

Kevin Kilty
Reply to  Hugh
February 20, 2015 10:09 am

And so, if all we are looking at is averages, then does a chaotic process, one dominated by 1/f noise or something akin to it, have an “average” in the sense that most people assume? I understand that one can always calculate an “average”, but does this average possess the sort of properties that Hansen implicitly assumed when he said “99% certain”?
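One way to see why 1/f-type behaviour matters for that question, in a throwaway numerical sketch (the spectral-synthesis recipe and sample sizes below are arbitrary choices for illustration):

```python
# Rough sketch of the question: does the running mean of a 1/f-type
# ("long memory") signal settle down the way the mean of white noise does?
# The synthesis below (shape white noise by 1/sqrt(f) in the Fourier domain)
# is one common recipe, chosen for brevity, not a claim about how the
# climate behaves. The series mean is forced to zero by construction, so
# the interesting part is how far the partial averages wander on the way.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

white = rng.standard_normal(n)

freqs = np.fft.rfftfreq(n)
spectrum = np.fft.rfft(rng.standard_normal(n))
spectrum[1:] /= np.sqrt(freqs[1:])   # amplitude ~ f^(-1/2)  =>  power ~ 1/f
spectrum[0] = 0.0                    # drop the mean (zero-frequency) term
pink = np.fft.irfft(spectrum, n)
pink /= pink.std()                   # same unit variance as the white noise

run_white = np.cumsum(white) / np.arange(1, n + 1)
run_pink = np.cumsum(pink) / np.arange(1, n + 1)
for k in (1_000, 10_000, 50_000):
    print(f"N={k:>6}   white running mean = {run_white[k - 1]: .4f}   "
          f"1/f running mean = {run_pink[k - 1]: .4f}")
```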

son of mulder
Reply to  Hugh
February 20, 2015 10:12 am

Hugh says “The climate is about average” and I say it is about “Strange Attractors” such as the Lorenz Attractor.

Max Dupilka
Reply to  Hugh
February 20, 2015 11:30 am

An “average” in meteorology can be a very misleading concept. As an example, here in Alberta, Canada, in the winter we basically have two flow regimes. We are either in a cold northerly flow from the Arctic, in which case daytime highs are in the -20C area, or we are in a mild westerly flow from the Pacific with highs around +5C. This is very much a bi-modal distribution of temperature with little in-between ground. So what is the meaning of an “average”? If, over the longer term, the “average” temperature in the winter increases, it is most likely because we are more often in a milder westerly flow. And what is the underlying cause of the more persistent westerly flow?
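A toy numerical version of that bimodal-winter point, with made-up regime statistics (two Gaussian regimes near -20 C and +5 C; the spreads and frequencies are invented for illustration, not Alberta data):

```python
# Toy version of the bimodal-winter point: two flow regimes (Arctic around
# -20 C, Pacific around +5 C), and the only thing that differs between the
# two "climates" below is how often each regime occurs. Regime means,
# spreads and frequencies are made-up numbers, not Alberta statistics.
import numpy as np

rng = np.random.default_rng(1)

def winter_mean(p_pacific, days=100_000):
    pacific = rng.random(days) < p_pacific
    temps = np.where(pacific,
                     rng.normal(5.0, 4.0, days),     # mild westerly regime
                     rng.normal(-20.0, 6.0, days))   # cold Arctic regime
    return temps.mean()

print("40% Pacific days: mean winter temperature =", round(winter_mean(0.40), 1), "C")
print("50% Pacific days: mean winter temperature =", round(winter_mean(0.50), 1), "C")
# The "average winter" warms by roughly 2.5 C even though neither regime
# changed at all, and hardly any individual day ever sits near the average.
```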

Udar
Reply to  Hugh
February 20, 2015 1:32 pm

Averages are useful for random effects. Chaotic systems – not so much. I’d say that the butterfly effect is immensely important to climate.
I am not sure how “models can handle the butterfly effect” when models do not have the resolution necessary to handle it, and you can’t simply average it out.

Duster
Reply to  Hugh
February 20, 2015 3:20 pm

Hugh, “averages” cut both ways. If model averages – that old ensemble – tracked the data average without serious “help” from homogenization and adjustment, you might have a point. The problem is you have not the foggiest notion of what the “Butterfly Effect” is, nor, apparently, who Edward Lorenz was. On a more immediate note, you have not noticed that, despite the adjustment help, the models STILL can’t track even what is being passed off as real data, whose connection with “reality” becomes steadily more tenuous and homogenized.

Hugh
Reply to  Hugh
February 21, 2015 1:10 am

I admit that an average is just a number. Much of the CAGW scare is about increased variance or extremes or about tipping points, which you may well see as strange attractors in action.
So the climate is about averages AND variances.

Doug Badgero
Reply to  Hugh
February 21, 2015 7:23 am

Computers cannot “handle” chaos. They can simulate a response that looks chaotic. However, even if we had the data granularity to realistically initialize the model, the real world would quickly diverge from the model output.

Steve Reddish
Reply to  Hugh
February 21, 2015 12:07 pm

Max, and Hugh,
Weather reports in the MSM often refer to averages as if they are the norm, saying things like: “today was 5F above normal”, when as Max says, 5F above the average IS normal. Then, the claim is made that this variance from average is proof that things are not normal.
SR

Janice Moore
Reply to  Hugh
February 21, 2015 1:54 pm

Because Dr. Essex’s main points are overlooked, ignored, or forgotten by several of the commenters below, I’m interrupting the discussion of “The Butterfly Effect” at the first space available to insert:
8 Important Underlying Points to Bear in Mind

1. Solving the closure problem.
2. Computers with infinite representation.
3. Computer water and cultural physics.
4. Greenhouses that don’t work by the greenhouse effect.
5. Carbon free sugar.
6. Oxygen free carbon dioxide.
7. Nonexistent long-term natural variability.
8. Nonempirical climate models that conserve what they are supposed to conserve.

{at ~ 1:07:06 in the video}

Somebody
Reply to  Hugh
February 22, 2015 12:40 am

Do you have a mathematical proof that the averages you are babbling about are not chaotic? If yes, go and pick up your Nobel science prize. Before attempting to attack the climate system, which is practically infinitely more complex, try simple systems, such as a double pendulum. See if the center-of-mass average is chaotic or not, for example. But perhaps you prefer the arithmetic average of the positions of the weights. You might learn a thing or two about chaotic systems. That is, that averages can easily be chaotic, too.

PiperPaul
Reply to  M Simon
February 20, 2015 10:22 am

Wasn’t “The Butterfly Effect” just a metaphor for randomness and not something to be taken literally (as it seems some have done)?

catweazle666
Reply to  PiperPaul
February 20, 2015 10:40 am

No.
It was to do with chaos theory – entirely different to randomness – and the way that a very minor event could nudge a process into a different attractor, or on the other hand a large perturbation might make little or no difference whatsoever.
Read James Gleick’s excellent work “Chaos”.

DC Cowboy
Editor
Reply to  PiperPaul
February 20, 2015 10:43 am

Yes, it is a metaphor. However, the dependence on initial conditions of a chaotic system was described as early as 1890 (the Three Body Problem, Henri Poincaré). Interestingly enough, Poincaré later suggested that this ‘problem’ was especially relevant to fields like meteorology.
I really like Lorenz’s definition of chaos: “When the present determines the future, but the approximate present does not approximately determine the future.”
I would think that the idea of an ‘average’ has no meaning for chaotic systems, although I’m not nearly mathematician enough to even approach a ‘proof’.

DC Cowboy
Editor
Reply to  PiperPaul
February 20, 2015 10:44 am

It was also the title of Dr Lorenz’s 1972 paper, “Predictability: Does the Flap of a Butterfly’s Wings in Brazil set off a Tornado in Texas?”

Duster
Reply to  PiperPaul
February 20, 2015 3:28 pm

It is the phrase applied to describe sensitivity to initial conditions. Lorenz encountered the fact that he could not get a model, restarted from the same inputs, to refrain from divergent behaviour. He initially thought it was due to rounding effects in the hardware he was using, but he soon found that the same divergence of results occurred regardless. Moreover, the process would cycle around a particular value for an indeterminate period and then abruptly switch states and orbit an alternative state for a time. Nothing can eliminate that, since it is inherent in the mathematics themselves and will appear regardless of how the system is calculated. Curiously, the plot of this cycling behaviour that Lorenz derived resembled a butterfly.

albertkallal
Reply to  M Simon
February 20, 2015 7:01 pm

The problem (like so much of science) is that ideas are floated that don’t make sense. There is no such thing as a “random” event. All such events have a cause, and are subject to the laws of physics. The fact that we don’t know where a baseball is going to land is no excuse for throwing out the laws of physics and stating that the baseball “thinks” or makes a bunch of independent choices as it soars through the air!
For a given set of circumstances, the outcome is ALWAYS the same. This is all math, and 2 + 2 = 4.
As such, no demonstrable random event exists. The fact that we lack the tools to determine the starting parameters of any physical event does not by any reasoned logic show or prove that such outcomes are “random”. If by random we mean we cannot determine starting parameters then fine, but such logic is not a proof that events are random.
The fact that we can use statistics to show or state that the baseball will land in the field 90% of the time (just like quantum mechanics does) is simply an admission that we don’t have the ability to determine the outcome, but does NOT prove or suggest that the outcome is NOT determined based on a set of laws and rules.
So no such thing as a random event exists.
And the butterfly effect does not stand up to physics and math if you are talking about some butterfly flapping its wings and causing a tornado.
Regards,
Albert D. Kallal
Edmonton, Alberta Canada

Robert Austin
Reply to  albertkallal
February 20, 2015 7:24 pm

Decay of the individual atoms of radionuclides. Not applicable to our discussion, but such events are random, so don’t be so adamant that random events do not exist.

Reply to  albertkallal
February 21, 2015 2:10 am

Austin, radionuclide decay is only random because we don’t have the tools to understand how and why it happens.

daved46
Reply to  albertkallal
February 21, 2015 5:45 am

Well, in one sense I agree that there’s no such thing as a random event, but according to the Copenhagen interpretation of Quantum Mechanics random action is the be-all and end-all of everything. In this interpretation, every event is a random “decay” of the universal wave function. By that I mean every action is simply a random instantiation of the set of possible states of the system. The states follow the equations of QM, which produce the odds of a particular state occurring, but it’s impossible to know which one will occur (usually; but google the “Aspect experiment” for exceptions).
However, there is, so far as I can tell, no “mechanism” by which randomness can occur, which means that whatever the actual situation is, it seems to me to be no different from the traditional definition of the existence of a God, albeit a Quantum God. This is finessed in the Copenhagen interpretation of QM by demanding that we (humans) cannot ask any question about how random actions occur. I personally think this was done because the founding “fathers” of the interpretation didn’t want to believe in a God existing. Your mileage may vary.

Mark T
Reply to  albertkallal
February 21, 2015 8:40 am

Um, Brownian motion…
Mark

Somebody
Reply to  albertkallal
February 22, 2015 12:45 am

“Radionuclide decay is only random because we don’t have the tools to understand how and why it happens” – Don’t be so certain about your uncertainty. Google up ‘hidden variables theory’. Also ‘Bell inequalities’.

DirkH
Reply to  albertkallal
February 22, 2015 6:04 pm

“And the butterfly effect does not stand up to physics and math if you talking about some butterfly flapping its wings and causing a tornado.”
Big words, contradicting the definition of chaos (which is an upward shift of state information from LSB to MSB, amplifying the amplitude of deviations).
(so even the decay of a single C14 atom to Nitrogen, or even the capture of a single IR photon by a CO2 molecule, could be said to have caused the tornado – but we better not give the warmists ideas)
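That LSB-to-MSB picture can be made literal with the simplest chaotic map there is, the doubling map x -> 2x mod 1, which shifts the binary expansion of the state one bit to the left per step; a throwaway sketch (a textbook toy, not a climate model):

```python
# The doubling map x -> 2x mod 1 shifts the binary expansion of x left by
# one bit per step, so a disturbance in the n-th bit (an "LSB") reaches the
# leading bit (the "MSB") after about n steps. Textbook toy, nothing more.
x = 0.123456789
y = x + 1e-12            # a perturbation far below anything measurable

for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}   x = {x:.12f}   |x - y| = {abs(x - y):.3e}")
    x = (2 * x) % 1.0
    y = (2 * y) % 1.0
# By about step 40 the 1e-12 disturbance is order one; by about step 56 the
# float64 significand is exhausted and both orbits collapse to exactly 0.0,
# which is the finite-precision problem in its purest form.
```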

February 20, 2015 8:05 am

Unless history repeats itself again
http://www.vukcevic.talktalk.net/CET1690-1960a.gif

catweazle666
February 20, 2015 8:05 am

Anyone who claims that meaningful predictions can be made, over any significant time period, for an effectively infinitely large, open-ended, non-linear, feedback-driven chaotic system (where we don’t know all the feedbacks, and even for the ones we do know, we are unsure of the signs of some critical ones) – hence subject to, inter alia, extreme sensitivity to initial conditions – is either a charlatan or a computer salesman. Or both, of course.
Ironically, the first person to point this out was Edward Lorenz – a climate scientist.
You can add as much computing power as you like, the result is purely to produce the wrong answer faster.

George Tetley
Reply to  catweazle666
February 20, 2015 8:25 am

Edward Lorenz, now there is a man you can quote. Well, that one anyway.

JJM Gommers
Reply to  catweazle666
February 20, 2015 9:42 am

Short term is already impossible; long term (2100) is scientific blasphemy.

Danny Thomas
Reply to  catweazle666
February 20, 2015 9:56 am

catweazle666,
You mean like this (From above IPCC reference):
“More generally, the intensification of the hydrological cycle with increased CO2 is a robust conclusion. For possible changes in extreme weather and climate events, the most robust conclusions appear to be: (a) an increased probability of extreme warm days and decreased probability of extreme cold days and (b) an increased chance of drought for mid-continental areas during summer with increasing CO2 (see Chapter 9, Section 9.3.6).”
http://www.usatoday.com/story/weather/2015/02/20/winter-weather-cold-snow-record-temperatures/23728379/
Gotta love those “most robust conclusions”. But to their credit it did say “appear”.

Aphan
Reply to  Danny Thomas
February 20, 2015 12:52 pm

Danny and catweazle-
Danny’s quote is one of the most beautiful examples of equivocation/prevarication ever published by “science”:
“More generally (but not specifically)…possible changes (anything is possible)…..appear to be (from how we view it)…increased probability of extreme warm days….(not “increased extreme warm days…just the increased PROBABILITY of them)…decreased probability of extreme cold days (again…not an actual increase…just the PROBABILITY of one)….increased CHANCE of drought…(no actual increase predicted).
Even their comment about their conclusions isn’t robust…except robustly crap.

Danny Thomas
Reply to  Aphan
February 20, 2015 1:35 pm

Aphan,
My impression too! And even with all that, where they decided it was “robust” it wasn’t.

jorgekafkazar
Reply to  Danny Thomas
February 20, 2015 3:05 pm

“Robust” is, of course, synonymous with the Japanese “boroshitu.” And those “extreme warm days” are the product of delusional thinking. At most, nighttime minimum temperatures will rise slightly, and much of the rise will occur above 60° Latitude. But as of now, the entire AGW notion isn’t happening, falsifying the putotive (sic) climate models.

chrisyu
Reply to  catweazle666
February 22, 2015 8:05 am

Almost: adding computing power feeds the illusion that the more powerful the computer, the more right the answer MUST be.

rgbatduke
February 20, 2015 8:11 am

I watched a variation of this talk several years ago, and it was brilliant and, AFAICT (and I can tell a lot!) utterly correct and fair. A Kolmogorov scale cell for atmospheric air is order of 10^{-9} cubic meters (one cubic millimeter). The scale of cells used in climate models is around 10^{13} cubic meters (100x100x1 kilometers). That is a difference of 22 orders of magnitude. The timescale used in the climate models is determined by the size of the cells, which is cheating: 10^5/340 \approx 300 seconds (five minutes) is the time required for sound to cross a cell, ignoring the fact that the cell is only one kilometer thick so that there is an intrinsic dynamical mismatch between vertical and transverse/horizontal dynamics. Hence climate models use stepsizes of roughly 5 minutes, the time needed for pressure variations to propagate across a cell so that they can pretend that a cell has a homogeneous pressure and temperature. The timescale required for a Kolmogorov scale cell is 10^{-3}/340 \approx 3 microseconds. The ratio between them is another factor of 10^{8}, making climate models a stunning 30 orders of magnitude short of where they would need to be in order to reliably integrate the spatiotemporal dynamics, and even if we could integrate at this granularity the solution would still be chaotic and hence infinitely sensitive to initial conditions.
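For anyone who wants to check the bookkeeping, the scale gap quoted above is plain arithmetic; the inputs below (1 mm Kolmogorov length, 100 km x 100 km x 1 km cells, 340 m/s sound speed) are the figures from the comment itself:

```python
# Back-of-the-envelope check of the scale separation described above. The
# inputs (1 mm Kolmogorov length, 100 km x 100 km x 1 km grid cells, 340 m/s
# sound speed) are the figures quoted in the comment; the rest is arithmetic.
import math

kolmogorov_len = 1e-3               # m, order-of-magnitude Kolmogorov scale
cell_volume = 1e5 * 1e5 * 1e3       # m^3, a 100 km x 100 km x 1 km model cell
kolmo_volume = kolmogorov_len ** 3  # m^3, roughly one cubic millimetre

c_sound = 340.0                          # m/s
dt_model = 1e5 / c_sound                 # ~300 s: sound crossing a cell
dt_kolmo = kolmogorov_len / c_sound      # ~3 microseconds

space_gap = cell_volume / kolmo_volume
time_gap = dt_model / dt_kolmo

print(f"volume ratio   : 10^{math.log10(space_gap):.0f}")              # 10^22
print(f"timestep ratio : 10^{math.log10(time_gap):.0f}")               # 10^8
print(f"combined       : 10^{math.log10(space_gap * time_gap):.0f}")   # 10^30
```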
As it is, the assumptions built into the cell dynamics are merely absurd — cells are much larger than well known energy dissipating structures such as thunderstorms and hence are essentially “blind” to thunderstorm dynamics, cloud dynamics, nucleation and growth of the defects that eventually become large scale weather patterns, and more. And then there is the second coupled Navier-Stokes system — the ocean — with its enormously complicated dynamics, chemistry, and boundary. And I — or he — could go on.
To attempt to solve the unsolvable, climate modellers have to replace all of this dynamics at less than their cell scale with smoothed approximations. Thunderstorms are always 1 cell in size (they cannot be any smaller) so that where a real weather pattern might have a front with scattered thunderstorms along a 300 x 100 kilometer band, the best a climate model could do would be to have thunderstorms in 1 out of three cells or some intermediate “rainy” state assigned to the cells that is supposed to correspond to the average “thunderstorminess” and correctly add up to the right vertical heat and moisture transfer and so on. Similarly scattered clouds on a scale from meters to kilometers become some sort of crude average modulation of cell albedo and radiative transfer. This is further modulated with ad hoc corrections for GHGs, aerosols, soot, and so on, all on a granularity of 100 km square patches where a single property has to be assigned to the entire cell and then dynamically varied, timestep to timestep, for the entire cell.
All of this is perfectly obvious when one compares the actual climate to the trajectories produced by climate models. From tiny perturbations of initial conditions, they generate whole families of future climates, some warming, some actually cooling. The variance is enormous, and utterly non-physical. The autocorrelation times within the models themselves aren’t close to the actual autocorrelation times of the climate (how could they be? They have the wrong relaxation dynamics on nearly all time and length scales!) The fluctuations in the models are several times larger than the actual fluctuations in the real climate. They get the temperature of the troposphere egregiously wrong. They fail to predict floods or droughts anywhere near accurately. They cannot predict large scale self-organized phenomena like ENSO that dominate discrete Hurst-Kolmogorov steps in the actual climate state and hence are just plain wrong almost everywhere, almost all of the time, even as they produce something that sorta kinda if you squint a bit looks like a plausible climate trajectory. Two different climate models produce completely different results even if run from the same initial state. Our knowledge of initial state is nonexistent and (because the real climate is highly non-Markovian, especially when coarse grained) they cannot even pretend to capture the actual climate state over the decades to centuries needed to properly initialize the model, where heat swallowed by the ocean a century ago surfaces in the thermohaline circulation to affect climate in significant ways today. And finally, nobody even tries to assess the climate models, reject the ones that perform the worst, and keep the ones that actually produce results that resemble the climate, so we aren’t even optimizing on our limited set of climate model attempts to try to evolve one that sort of works.
Then comes the greatest sin of all: taking all of the non-independent climate models produced by the various agencies and research groups (with places like NASA “contributing” roughly 1/5th of the final weight with 7 closely related variations of the same damn model), with all of their many and varied warts, without the slightest attempt to accept or reject a single model, and flat-averaging them into a multi-model “ensemble” mean that is supposed to magically be predictive because all of the flaws in all of the models will cancel one another out!
Oh. My. God.
rgb

F. Ross
Reply to  rgbatduke
February 20, 2015 11:12 am

Great post! + a bunch.

Nick Stokes
Reply to  rgbatduke
February 20, 2015 12:21 pm

“The ratio between them is another factor of 10^{8}, making climate models a stunning 32 orders of magnitude short of where they would need to be in order to reliably integrate the spatiotemporal dynamics, and even if we could integrate at this granularity the solution would still be chaotic and hence infinitely sensitive to initial conditions.”
Well, flow around an aircraft would be many orders of magnitude short too. And all sorts of fluctuations on the way to deriving average lift, drag etc. But that’s how the planes you fly in are designed.
CFD works.

Reply to  Nick Stokes
February 20, 2015 2:59 pm

You don’t wait until the plane is loaded with hundreds of passengers before you test the design. It would be interesting to survey aerodynamicists to see if they would get into a plane with a revolutionary wing design before practical testing.

Duster
Reply to  Nick Stokes
February 20, 2015 3:35 pm

No, that’s why test pilots are paid so much and have such a difficult time buying life insurance.

Udar
Reply to  Nick Stokes
February 20, 2015 4:16 pm

CFD works
Only if validated. Unvalidated CFD models, just like climate models, produce unusable garbage. And no one designs planes that way.

Nick Stokes
Reply to  Nick Stokes
February 20, 2015 5:20 pm

“And no one designs planes that way”
Yes they do!
“The application of CFD today has revolutionized the process of aerodynamic design. CFD has joined the wind tunnel and flight test as primary tools of the trade.”
My point is that RGB basically claims that CFD can’t work because of scale problems. Just not true.

daved46
Reply to  Nick Stokes
February 21, 2015 6:12 am

Nick, you need to show us how far the CFD models of an airfoil actually fall short of RGB’s criteria. A link in particular would be helpful. I think you’re blowing smoke, but if you’re correct, it might be useful to see just how CFD in the engineering field overcomes such order-of-magnitude shortfalls as compared to how climate modeling tries to do so.

Reply to  rgbatduke
February 20, 2015 12:27 pm

Super well-said.

cba
Reply to  rgbatduke
February 20, 2015 12:28 pm

no kidding. LOL. enjoyed the telcon with you a coupla weeks back. never got to call roy s as you suggested. – maybe next yr. this time it’s gonna be gravity waves lol.

Nick Stokes
Reply to  rgbatduke
February 20, 2015 12:28 pm

“they can pretend that a cell has a homogeneous pressure and temperature”
They don’t. The Navier-Stokes equations, with coupled energy equation, deal with the conserved quantities momentum, mass and energy. The total momentum in a cell is what has to be balanced with its neighbors. No claim of homogeneity within the cell. Reynolds averaging makes this clear.

Editor
Reply to  Nick Stokes
February 20, 2015 1:35 pm

They don’t pretend that a cell has a homogeneous pressure and temperature????? It’s not possible for them to do anything else, because the cell is the smallest unit. By definition, they can’t distinguish between different places within the same cell.

Nick Stokes
Reply to  Nick Stokes
February 20, 2015 2:43 pm

“It’s not possible for them to do anything else, because the cell is the smallest unit.”
Maybe not, but that doesn’t mean they do it. You’re missing the significance of conservation. A train carriage has a mass, a momentum, and a KE. There is much complexity in the interior – movement, density variation etc. But in its interaction with the neighboring carriages, it’s the carriage mass, momentum and KE that count. You don’t have to “pretend” that the carriage interior is homogeneous.

Udar
Reply to  Nick Stokes
February 20, 2015 4:25 pm

But in its interaction with the neighboring carriages, its the carriage mass, momentum and KE that count. You don’t have to “pretend” that the carriage interior is homogeneous.
But if your non-homogeneous contents shift enough to break the carriage, it will derail the train and completely destroy it. You assume that this is not possible in trains, as no one would load one that way – but you cannot make this assumption regarding weather. A tornado or thunderstorm within a cell would be as effective at wrecking it as a steel ball rolling around a train carriage on a sharp turn…

Luther Bl't
Reply to  rgbatduke
February 20, 2015 12:34 pm

+e

jorgekafkazar
Reply to  rgbatduke
February 20, 2015 3:22 pm

“But aside from those pesky little leaks, the St. Francis Dam works fine, Mr. Mulholland?”

Reply to  rgbatduke
February 20, 2015 6:45 pm

Eish!! … and then they expect us to believe that they can predict today the effect of a change of a few molecules of CO2 in 100 years’ time? Here, pull my finger!

David A
Reply to  rgbatduke
February 21, 2015 3:29 am

RGB, despite all that you say, the models are REMARKABLY CONSISTENT! I maintain that their consistency is highly informative. In what way are they consistent?
They ALL run consistently too warm! This is highly informative and tells us something, despite their having some very wrong fundamentals. In fact, for them all to be so consistently wrong in the SAME direction is an indication that the climate sensitivity to their most fundamental control knob for warming, CO2, is very likely much less than they think. By simply tuning down the climate sensitivity to CO2 used in the models, they would all become far more accurate to the observations.

David A
Reply to  David A
February 21, 2015 3:51 am

Indeed, the climate models compared to the observations are very strong evidence that the climate sensitivity to CO2 is much less than the proponents of CAGW think. The modelers have given much evidence against their pet theory, but unlike true scientists, they refuse to accept the objective evidence their science consistently shows them.
Instead of using the observations to critique and correct their assumptions, they take the mean of their consistently too-warm projections, and base their projected harms (also theorized and not observed in the real world) on what is KNOWN to be wrong.
Nick Stokes cannot be a serious commentator, as he is too smart not to realize that all the modeled flight simulations use real-world observations to make them accurate. Basically, Nick is suggesting that pilots fly over mountain ranges with an altimeter that consistently tells them the plane is five thousand feet higher than it really is.
In the real world, if engineers ignored a bad altimeter that consistently showed planes higher than they really were, and installed it in commercial airliners, they would be held liable for manslaughter for every plane that flew into a mountain.
Unfortunately, CAGW proponents are also in the real world. If they were held liable for their wrong projections, and for the wasted billions that result from their promotion of a consistently, observably wrong modeled mean, then perhaps they would be less likely to trade their integrity for government funds, and governments would stop flying their economic futures into the cliffs of reality.

Somebody
Reply to  rgbatduke
February 22, 2015 12:49 am

“…because all of the flaws in all of the models will cancel one another out!” Yeah, like there is some magic that cancels errors out. I’m surprised they do not just have experts guess values (perhaps obtained by throwing dice or something like that), then average them and pretend that the errors will average out and the truth will emerge from the numerology.
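A toy calculation of why that hope fails whenever the models share a flaw (all numbers below are invented purely for the illustration):

```python
# Toy calculation: averaging many "models" only cancels the errors that are
# independent between models. Any error they share (here a common warm bias)
# survives the ensemble mean untouched. All numbers are invented.
import numpy as np

rng = np.random.default_rng(2)

truth = 0.10          # pretend "true" value, arbitrary units
shared_bias = 0.15    # error common to every model
n_models = 30

models = truth + shared_bias + rng.normal(0.0, 0.20, n_models)  # + own noise

print("spread of individual models :", round(models.std(), 3))
print("error of the ensemble mean  :", round(models.mean() - truth, 3))
# The independent scatter shrinks roughly as 1/sqrt(30), but the ensemble
# mean still sits near +0.15: shared flaws do not average out.
```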

Richard T
February 20, 2015 8:20 am

The GCMs cannot and likely will never be able to accurately forecast future climate states. It is not a question of larger and faster computers. The problem is embedded in the modeling process itself. There are three levels of approximation encountered in development of the model. First, and perhaps most significant for the complex climate system, is the completeness and accuracy of the underlying physics. If lacking in either regard, the resulting model is at best a rough approximation of the physical system. Second, the physics must be cast in mathematical terms. Embedded in the terms themselves are idealizations and approximations associated with the mathematical theory, another level of approximation. Thirdly, when the equations cannot be solved in closed form, numerical methods are employed. Both physical space and time are discretized. The resulting solutions to the numerical forms are inherently approximations of the solutions of the continuum problem. These effects are problematic for nonlinear, chaotic systems. The “solutions” to the models will “stray” from the physical reality with increasing error. Overcoming these effects requires that a validation process be employed, demonstrating fidelity of computed states to reality. I am not aware of such for the GCMs.
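A tiny illustration of that third level of approximation, using forward Euler on an equation whose exact solution we happen to know (the equation and step sizes are arbitrary textbook choices):

```python
# Even for an equation we can solve exactly (dy/dt = y, y(0) = 1, so
# y(t) = e^t), the discretized solution differs from the continuum one, and
# the error depends on the step size chosen. For a chaotic system there is
# no exact answer to compare against, which is the point.
import math

def euler(dt, t_end=5.0):
    y = 1.0
    for _ in range(round(t_end / dt)):
        y += dt * y          # forward Euler step for dy/dt = y
    return y

exact = math.exp(5.0)
for dt in (0.1, 0.01, 0.001):
    approx = euler(dt)
    print(f"dt={dt:<6} numerical={approx:9.3f}  exact={exact:9.3f}  "
          f"relative error={abs(approx - exact) / exact:.2%}")
```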

Johan
Reply to  Richard T
February 20, 2015 9:37 am

Overcoming these effects requires a validation process be employed demonstrating fidelity of computed states to reality. I am not aware of such for the GCMs.
That must be the understatement of the year !

PiperPaul
Reply to  Johan
February 20, 2015 10:38 am

Should I infer that the production of ‘as-built’ engineering drawings after construction of the facility is complete is a similar concept?

David A
Reply to  Johan
February 21, 2015 4:04 am

Indeed, Johan. If climate modelers were airplane altimeters, they, the climate modelers, would all be in jail. The models all run far too warm relative to the observations. The altimeters all say the plane is 5,000 feet higher than it really is. The planes keep flying into mountain cliffs.
The governments that fly the economic policies of the states they represent keep flying their energy policies into the cliffs of engineering reality, wasting hundreds of billions of dollars on wind and solar, destabilizing their grids and destroying sustainable jobs.
They do this all based on the modeled mean projections of the IPCC, which are known to be wrong. It would be criminal for altimeter engineers to ignore their always-wrong altimeters. Likewise, I think it is criminal for governments, and government scientists, to ignore their models that are always wrong in the same direction, and still insist that the people of the world fly with them.

David A
Reply to  Johan
February 21, 2015 4:06 am

My thanks for the above comment to Nick Stokes, who gave me the idea of comparing climate models to airplane engineering.

old construction worker
February 20, 2015 8:23 am

If you predict a frog has wings to keep him from bumping his hind side, then the computer model will prove he is flying even though you can’t see its wings.

Andres Valencia
February 20, 2015 8:25 am

Thanks, Dr. Chris Essex. Yes, they knew from the start, but the IPCC marched on.
IPCC – TAR 2001:
14.2.2 Predictability in a Chaotic System: (page 5)
The climate system is particularly challenging since it is known that components in the system are inherently chaotic; there are feedbacks that could potentially switch sign, and there are central processes that affect the system in a complicated, non-linear manner. These complex, chaotic, non-linear dynamics are an inherent aspect of the climate system.
From http://www.grida.no/publications/other/ipcc_tar/?src=/climate/ipcc_tar/wg1/504.htm
From the Executive Summary: (page 3)
The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible. Rather the focus must be upon the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions. Addressing adequately the statistical nature of climate is computationally intensive and requires the application of new methods of model diagnosis, but such statistical information is essential.
From http://www.grida.no/climate/ipcc_tar/wg1/pdf/TAR-14.pdf
They knew they could count on the liberal media and change governments around the world.

davidgmills
Reply to  Andres Valencia
February 20, 2015 9:26 am

The liberal media is owned by six very conservative multinational corporations who only care about profit. The insiders of these corporations probably giggle every time this meme is mentioned.

F. Ross
Reply to  davidgmills
February 20, 2015 11:07 am

An unsupported allegation. Any plausible links, proof?

Bill in IL
Reply to  davidgmills
February 20, 2015 2:03 pm

Since when is Time Warner conservative? When did Ted Turner change his stripes? Is Disney conservative? Viacom? CBS? NBC? Come on, I mean really, you think these corporations are conservative and only interested in making money?

Reply to  davidgmills
February 21, 2015 11:56 am

Someone should check the business model for MSNBC and some others, ya think?

Alx
Reply to  Andres Valencia
February 20, 2015 9:53 am

The climate system is a coupled non-linear chaotic system, and therefore the long-term prediction of future climate states is not possible. Rather the focus must be upon the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions.

A simple translation of the above is, “Since we can’t get one model to predict long term, we’ll have many models predicting possibilities and the possibility of the possibilities.”
A simpler translation is, “We’re using math and supercomputers to more accurately make stuff up.”
Simplest translation is, “We’re just making stuff up.”

Santa Baby
Reply to  Alx
February 20, 2015 10:22 am

A perfect science for implementing policy based science propaganda to promote political agenda?

Santa Baby
Reply to  Alx
February 20, 2015 10:32 am

A perfect science for advocating a political agenda with policy based “science”?

george e. smith
Reply to  Alx
February 20, 2015 12:19 pm

Well, all mathematics and science models are just made-up stuff. We replace reality with a concept in our heads.
But we still do require that our made-up notions always give the same explanation when presented with the same observations.
The whole purpose of a theory is to give us a pretty good idea of what to expect if we do an experiment that we have never done before, based on our “model” of what has happened when we did similar experiments.
We expect, to first order, that experiments only slightly different from ones we’ve already done will give observable results only slightly different from experimental results we have seen before.
So a theory does not need to be unique, nor does it need to follow any sort of “common sense”.
If it consistently predicts roughly what we ultimately observe, it’s a good theory; but only up to the time when it predicts something that is quite contrary to what we observe when we do the experiment. Then we have to go back to the drawing board, and fix the bugs in the theory.
g

Reply to  Alx
February 21, 2015 11:59 am

That’s correct.
I got the same thing when I ran it through my translator program.

PiperPaul
Reply to  Andres Valencia
February 20, 2015 11:00 am

“…prediction of the probability distribution of the system’s future possible states
by the generation of ensembles of model solutions.”

Does this translate to: “One or more of these damned things duct-taped together might work, but let’s not check, just in case.”?

Duster
Reply to  PiperPaul
February 20, 2015 3:39 pm

No, it means there are inherent limits on the model that cannot be overcome. They are as much a part of the real system that is being modeled as other better understood elements.

David A
Reply to  Andres Valencia
February 21, 2015 4:22 am

…and yet the IPCC is still liable, as they recommend the modeled mean of consistently wrong models to project future harms for government policy.
They know all their models run wrong in the same direction. They ignore the most fundamental tenet of science, comparing your hypothesis to real-world observations. They recommend that the governments of the world take their KNOWN-to-be-WRONG modeled mean and from this wrong mean project equally hypothetical harms (also not observed in the real world); and on the basis of that known-to-be-wrong modeled mean, and the known-to-be-wrong hypothetical harms resulting from the hypothetical and wrong projected warming, they ask the world to spend trillions and fundamentally change forms of government toward an ever more centralist global solution.
In my mind they are liable, just as an engineer and the company he worked for would be liable if they knew the altimeters they designed all showed airplanes to be five thousand feet higher than they really were, and sold them anyway.

Reply to  David A
February 21, 2015 12:13 pm

Funny you should use the word liable since the people working for the UN (and therefore against mankind in general) consider themselves omnipotent. They come and go as they please and well above any country’s laws.
Makes it easier to lie, cheat and steal.

David A
Reply to  David A
February 21, 2015 8:34 pm

Unfortunately you are apparently correct. I have seen zero evidence of any accountability.

Dr Norman Page
February 20, 2015 8:27 am

The uselessness of the GCM’s for forecasting purposes is discussed in Section 1 at
http://climatesense-norpag.blogspot.com/2014/07/climate-forecasting-methods-and-cooling.html
Here is the conclusion:
“In summary the temperature projections of the IPCC – Met office models and all the impact studies which derive from them have no solid foundation in empirical science being derived from inherently useless and specifically structurally flawed models. They provide no basis for the discussion of future climate trends and represent an enormous waste of time and money. As a foundation for Governmental climate and energy policy their forecasts are already seen to be grossly in error and are therefore worse than useless. A new forecasting paradigm needs to be adopted.
The modeling community is itself beginning to acknowledge its failures and even Science Magazine
which has generally been a propagandist for the CAGW meme is now allowing reality to creep in. An article in its 6/13/2014 issue says:
“Much of the problem boils down to grid resolution. “The truth is that the level of detail in the models isn’t really determined by scientific constraints,” says Tim Palmer, a physicist at the University of Oxford in the United Kingdom who advocates stochastic approaches to climate modeling. “It is determined entirely by the size of the computers.” Roughly speaking, an order-of-magnitude increase in computer power is needed to halve the grid size. Typical horizontal grid size has fallen from 500 km in the 1970s to 100 km today and could fall to 10 km in 10 years’ time. But even that won’t be much help in modeling vitally important small-scale phenomena such as cloud formation, Palmer points out. And before they achieve that kind of detail, computers may run up against a physical barrier: power consumption. “Machines that run exaflops [10^18 floating point operations per second] are on the horizon,” Palmer says. “The problem is, you’ll need 100 MW to run one.” That’s enough electricity to power a town of 100,000 people.
Faced with such obstacles, Palmer and others advocate a fresh start.”
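A rough sketch of the scaling rule behind Palmer’s remark, assuming the usual bookkeeping that halving the horizontal spacing quadruples the number of grid columns and, via the CFL stability condition, roughly halves the usable time step:

```python
# Rough cost model behind the "order of magnitude per halving" remark in the
# quote above: halving the horizontal spacing gives 4x the columns and, via
# the CFL stability condition, roughly half the usable time step, so ~8x the
# work (call it ~10x with overheads). A common rule of thumb, not a statement
# about any particular model's code.
import math

def relative_cost(dx_from_km, dx_to_km):
    refine = dx_from_km / dx_to_km
    return refine ** 2 * refine   # 4x columns per halving, times finer steps

for target in (50, 10, 1):
    cost = relative_cost(100, target)
    print(f"100 km -> {target:>2} km grid: roughly 10^{math.log10(cost):.1f} "
          f"times today's computing power")
```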
Climate at a time scale of human interest is controlled by natural solar cycles, especially the 1000-year periodicity, which is plainly obvious in the data – see Figs 5-9 in Section 2 at the link above.
The earth is just approaching, just at, or just past the natural solar millennial peak – see Fig 9. If we look at the neutron count record – Fig 14 – which, together with the 10Be data, is the best proxy for solar activity, it is obvious that solar activity peaked in 1991. There is a 12-year delay between the driver peak and the global RSS temperature peak, which occurred in mid 2003, since when the earth has been in a cooling trend – see http://www.woodfortrees.org/plot/rss/from:1980.1/plot/rss/from:1980.1/to:2003.6/trend/plot/rss/from:2003.6/trend
The climate models are built without regard to the natural 60-year and, even more important, 1000-year periodicities, and lack even average common sense.
It is exactly like taking the temperature trend from, say, February to July and projecting it ahead linearly for 20 years or so. The models are back-tuned for less than 100 years when the relevant time scale is millennial. The whole exercise is a joke and a disaster for the reputation of science in general.

noaaprogrammer
Reply to  Dr Norman Page
February 20, 2015 9:15 am

When I wrote Fortran programs for NOAA back in the early 1970s, I was told that if weather forecasting was to have a shred of truth toward the end of a 2-week period in the future, we would have to include actual measurements of the heat transport provided by every small dust devil that occurred around the world. Since such phenomena are on the order of a few square meters, this was just a restatement of the butterfly effect, but on a slightly larger scale.

Gunga Din
Reply to  noaaprogrammer
February 20, 2015 10:12 am

I said this a few years ago.

http://wattsupwiththat.com/2012/05/12/tisdale-an-unsent-memo-to-james-hansen/#comment-985181
Gunga Din says:
May 14, 2012 at 1:21 pm
joeldshore says:
May 13, 2012 at 6:10 pm
Gunga Din: The point is that there is a very specific reason involving the type of mathematical problem it is as to why weather forecasts diverge from reality. And, the same does not apply to predicting the future climate in response to changes in forcings. It does not mean such predictions are easy or not without significant uncertainties, but the uncertainties are of a different and less severe type than you face in the weather case.
As for me, I would rather hedge my bets on the idea that most of the scientists are right than make a bet that most of the scientists are wrong and a very few scientists plus lots of the ideologues at Heartland and other think-tanks are right…But, then, that is because I trust the scientific process more than I trust right-wing ideological extremism to provide the best scientific information.
=========================================================
What will the price of tea in China be each year for the next 100 years? If Chinese farmers plant less tea, will the replacement crop use more or less CO2? What values would represent those variables? Does salt water sequester or release more or less CO2 than freshwater? If the icecaps melt and increase the volume of saltwater, what effect will that have year by year on CO2? If nations build more dams for drinking water and hydropower, how will that impact CO2? What about the loss of dry land? What values do you give to those variables? If a tree falls in the woods allowing more growth on the forest floor, do the ground plants have a greater or lesser impact on CO2? How many trees will fall in the next 100 years? Values, please. Will the UK continue to pour milk down the drain? How much milk do other countries pour down the drain? What if they pour it on the ground instead? Does it make a difference if we’re talking cow milk or goat milk? Does putting scraps of cheese down the garbage disposal have a greater or lesser impact than putting in the trash or composting it? Will Iran try to nuke Israel? Pakistan India? India Pakistan? North Korea South Korea? In the next 100 years what other nations might obtain nukes and launch? Your formula will need values. How many volcanoes will erupt? How large will those eruptions be? How many new ones will develop and erupt? Undersea vents? What effect will they all have year by year? We need numbers for all these things. Will the predicted “extreme weather” events kill many people? What impact will the erasure of those carbon footprints have year by year? Of course there’s this little thing called the Sun and its variability. Year by year numbers, please. If a butterfly flaps its wings in China, will forcings cause a tornado in Kansas? Of course, the formula all these numbers are plugged into will have to accurately reflect each ones impact on all of the other values and numbers mentioned so far plus lots, lots more. That amounts to lots and lots and lots of circular references. (And of course the single most important question, will Gilligan get off the island before the next Super Moon? Sorry. 😎
There have been many short range and long range climate predictions made over the years. Some of them are 10, 20 and 30 years down range now from when the trigger was pulled. How many have been on target? How many are way off target?
Bet your own money on them if want, not mine or my kids or their kids or their kids etc.

catweazle666
Reply to  noaaprogrammer
February 20, 2015 10:52 am

“The point is that there is a very specific reason involving the type of mathematical problem it is as to why weather forecasts diverge from reality. And, the same does not apply to predicting the future climate in response to changes in forcings.”
Wishful thinking.
This entirely ignores the concept of self-similarity at all levels of magnification, not to mention the problem of extreme sensitivity to initial conditions or the effects of bifurcation.

Reply to  noaaprogrammer
February 20, 2015 6:52 pm

Gunga … Joeldshore effectively puts his trust in an infant-sized handful of ‘scientists’ with an understanding of climate comparable to the resolution of their models in the real world … the rest of the “most of the scientists” just go along with them because it is easier than losing one’s job or prestige.

chemman
Reply to  Dr Norman Page
February 21, 2015 9:52 am

“The whole exercise is a joke and a disaster for the reputation of science in general.”
My view of the current iteration of Climate Science being akin to alchemy doesn’t mean I think the same for the rest of science. The people I know are able to distinguish between pseudo-science and real science.

steveta_uk
February 20, 2015 8:35 am

Thank you, Dr Essex, for introducing us to Carbon-free Sugar
https://www.dominosugar.com/carbonfree/faq.html

davidgmills
Reply to  steveta_uk
February 20, 2015 9:34 am

That was a hoot. I couldn’t help thinking about all of the diabetics who probably were wishing they could just get their hands on the stuff. I would not put it past Domino to put this stuff in the diabetic section of the supermarket.

steveta_uk
Reply to  davidgmills
February 20, 2015 9:36 am

No, that section is reserved for the sugar-free sugar.

davidgmills
Reply to  davidgmills
February 20, 2015 9:42 am

Isn’t that what no carbon sugar would be?

Gamecock
Reply to  steveta_uk
February 20, 2015 11:53 am

“CarbonFree® is the registered trademark of CarbonFund.org, a non-profit organization that certifies products as CarbonFree®”
“Join Now!
Become a leader in the fight against climate change: make a tax-deductible donation to offset your carbon footprint today!”
“Compensation of Leaders (FYE 12/2012):
Eric Carlson, President – $271,860 (16.75% of expenses)”
Eric Carlson is making a very good living at his “non-profit.”

Bart
Reply to  steveta_uk
February 20, 2015 1:30 pm

C12H22O11 – No carbon there!

D Johnson
Reply to  steveta_uk
February 21, 2015 5:55 am

I once saw a product labeled “Phosphate Free TSP”. TSP is tri-sodium phosphate.

chemman
Reply to  steveta_uk
February 21, 2015 9:59 am

Once many moons ago I was in an REI outlet buying some camping equipment. They had a section of water bottles that contained dehydrated water with the instructions to just add water to rehydrate. It was the best laugh I had that day.

February 20, 2015 8:35 am

“Prediction is very difficult, especially about the future.” – Niels Bohr.
https://thepointman.wordpress.com/2011/01/21/the-seductiveness-of-models/
Pointman

Alan McIntire
February 20, 2015 8:38 am

“When I was your age, I always did it for half an hour a day. Why, sometimes, I’ve believed as many as six impossible things before breakfast.”
A statement of the White Queen― Lewis Carroll, Through the Looking-Glass, and What Alice Found There
To give credit to the original source.
Another good source for quotes is the White Knight, whom Lewis Carroll modeled on himself:
“But I was thinking of a plan To dye one’s whiskers green, And always use so large a fan That they could not be seen.”

February 20, 2015 9:04 am

I blame it on Nature for not agreeing with those beautiful models made by very smart people (with apologies to Feynman).

dalyplanet
February 20, 2015 9:08 am

That was a fantastic lecture.

Curious George
February 20, 2015 9:20 am

I’ll add a seventh thing climatologists believe in: that tomorrow’s temperature readings will influence yesterday’s temperature. At least that is what is happening with NCDC temperatures.
Surprisingly, most climatologists don’t have a problem with causality working both forward and backward in time.

Alx
Reply to  Curious George
February 20, 2015 9:55 am

Well it worked well in all the Star Trek TV series. And of course Dr Who was good at it too.

Aphan
Reply to  Alx
February 20, 2015 12:59 pm

Wibbly Wobbly Timey Wimey Climatey Wimety stuff!

Winnipeg Boy
Reply to  Curious George
February 20, 2015 2:27 pm

Stephen Hawking calls it imaginary time (time moving in either direction), so it is a thing. Probably not in weather, or reality, at least our reality.

n.n
February 20, 2015 9:26 am

It’s because of chaotic (i.e. uncharacterized and unwieldy) processes that rational and reasonable people invented the scientific method to limit speculation in both time and space. The scientific domain does not even include all of a human being, planet Earth, or the solar system, let alone the “universe”. This was the beginning of enlightenment, which was based on the separation of science, philosophy, and faith.

Duster
Reply to  n.n
February 20, 2015 3:48 pm

Sir Francis Bacon explicitly described the scientific method in the late 16th century – not “people.” At the time he was contending with “scholastics”, who insisted on an orderly nature, and what he referred to as “empiricists”, who are effectively equivalent to the crowd that believes things based on anecdotes. Bacon explicitly argued for the “intervention” of the experiment between opinion and nature in order to aid in developing what he thought of as “true knowledge.” He was not concerned about chaos, and no one around him believed in any such thing.

Barry
February 20, 2015 9:27 am

This is misleading. Predicting a future climate state (e.g., Jan. 15, 2055) is not possible, but long term trends can be represented. Similarly, I cannot predict heads or tails on your 55th coin flip, but out of 100 flips with a (fair) coin, I bet I can closely predict the number of heads. Similarly, the exact spatial distribution of heat across the globe cannot be predicted, but global average temperatures can be, based on a global energy balance. Here’s a nice view of the chaotic system:
http://cci-reanalyzer.org/DailySummary/#
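Where the coin analogy holds, and where it breaks, in a toy comparison: independent fair-coin flips versus a “sticky” two-state process with the same 50/50 long-run odds (the 0.999 persistence and the sample sizes are arbitrary illustrative values):

```python
# Both sequences below are 50/50 "heads" in the long run, but one is
# independent flips and the other is a persistent two-state process (each
# step keeps its current state with probability 0.999). Same marginal odds,
# very different behaviour of the running average. Invented toy only.
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

iid = rng.integers(0, 2, n)            # independent fair-coin flips

persistent = np.empty(n, dtype=int)    # sticky regime switching
persistent[0] = 0
switch = rng.random(n) >= 0.999
for i in range(1, n):
    persistent[i] = 1 - persistent[i - 1] if switch[i] else persistent[i - 1]

for k in (1_000, 5_000, 20_000):
    print(f"N={k:>6}   coin average = {iid[:k].mean():.3f}   "
          f"persistent average = {persistent[:k].mean():.3f}")
# The coin average homes in on 0.5; the persistent series can sit far from
# 0.5 for thousands of steps at a time, even though its long-run odds are
# identical.
```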

Alan Robertson
Reply to  Barry
February 20, 2015 9:40 am

in re: your statement that “This is misleading.”
———–
That is proper warning of the rest of your post.

Michael Jankowski
Reply to  Barry
February 20, 2015 9:49 am

“Similarly?” Each flip of the coin is an independent event. Global average temperature from day-to-day, year-to-year, etc, is not.
As you noted, even in your case, the coin must be tuned to being “fair.” How many parameters in a climate model need to be tuned appropriately as well, and how many actually are?
Then again, the models seem to have the predictive capabilities no better than flipping a coin, so maybe you have a point…

Reply to  Michael Jankowski
February 20, 2015 9:56 am

My feelings exactly. If it’s all a pure random process, let’s forget about the underlying physical mechanisms.

Alx
Reply to  Michael Jankowski
February 20, 2015 10:01 am

Ok, so heads is warming and tails is cooling.
Who knew what a tremendous GCM a simple coin could make!
On a serious note, I have heard this coin-flip analogy used often, which is bizarre: in comparing climate to a coin that never changes, you’d have to believe climate never changes.

old construction worker
Reply to  Michael Jankowski
February 21, 2015 5:15 am

The trick is getting your coin to end up in the UN’s pocket

catweazle666
Reply to  Barry
February 20, 2015 10:54 am

“This is misleading.”
No it isn’t.
“but long term trends can be represented”
No they can’t.

Reply to  Barry
February 20, 2015 11:20 am

I think if you listen more carefully, his point was that a complex model (non-linear, chaotic, multiple variables, many of them unknown) solved numerically will inevitably degenerate into linear white noise (~57:00 min), with variations due to the parameterizations used and the machine epsilon (~37:00 min). Any natural climate variability that might actually be modelled early on will disappear into the noise with time.
A coin flip is very close to actually random, so conventional statistics usually work for casinos, but climate variability is not random, not uniform, and is not distributed as a Gaussian curve. Bad example. Try predicting how many vortex eddies you will get drawing a stick through calm water. Then try it when the waves are a foot high.
It’s unclear as yet whether global average temperatures can be based on a global average energy, and whether the result would mean anything in the real world. The physical processes are highly non-linear (emitted radiation varies as T^4) and the equations only apply to equilibrium temperatures. Only engineers have come up with approximate solutions for specific problems in narrow ranges. There is no such thing as an equilibrium global average temperature.
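A two-region toy version of that T^4 point (the temperatures are invented; only the Stefan-Boltzmann constant is real):

```python
# Radiated power goes as T^4 (Stefan-Boltzmann), so averaging temperatures
# and then computing the flux is not the same as averaging the fluxes. Two
# made-up "regions", same mean temperature, different mean emission.
SIGMA = 5.670374419e-8   # W m^-2 K^-4, Stefan-Boltzmann constant

cases = {
    "uniform":    [288.0, 288.0],   # both regions at the global mean
    "contrasted": [268.0, 308.0],   # same mean, +/- 20 K contrast
}

for name, temps in cases.items():
    mean_T = sum(temps) / len(temps)
    flux_of_mean = SIGMA * mean_T ** 4
    mean_flux = sum(SIGMA * T ** 4 for T in temps) / len(temps)
    print(f"{name:>10}: mean T = {mean_T:.0f} K   "
          f"sigma*<T>^4 = {flux_of_mean:6.1f} W/m^2   "
          f"<sigma*T^4> = {mean_flux:6.1f} W/m^2")
```

Same mean temperature, roughly 11 W/m^2 difference in mean emission: an average temperature alone does not pin down the energy balance.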

Aphan
Reply to  Barry
February 20, 2015 1:09 pm

Wow. Did you just COMPARE predicting the average of an “experiment” (flipping a coin) with only TWO possible outcomes (heads or tails) with predicting what global “average temperatures” will be in the future with hundreds if not thousands of possible outcomes and call them SIMILAR?
“Predicting a future climate state (e.g. Jan 15,2055) is not possible, but long term trends can be represented.”
You just said “I can’t tell you today what the climate will be like on Jan 15, 2055, but I can tell you on Jan 15, 2055 what the climate has done between Feb 20, 2015 and Jan 15th, 2055.”
Well DUH.

Winnipeg Boy
Reply to  Barry
February 20, 2015 2:35 pm

Barry has a point. 30 years ago they should have been close on their predictions for today’s climate. If they had just followed a 200 or 500 year trendline, I bet we would be bang-nuts on. But, going back to the coin analogy, they predicted that things would no longer be 50-50, so Hansen’s prediction was that 100 flips would yield 95 heads and 5 tails because there is one more molecule of CO2 per 10,000 in our modern air.
Funny enough, CO2 doesn’t affect coin flipping either.

jorgekafkazar
Reply to  Barry
February 20, 2015 3:42 pm

False analogy, Barry. Climate is not a coin, and its stochastic properties in no way correspond to a coin. We are nowhere near being able to perform a global energy balance.

Duster
Reply to  Barry
February 20, 2015 3:53 pm

The question is whether the “global average” temperature means anything. Many parts of the world experience very distinct weather regimes seasonally. Those regimes represent different modes and in regions, for instance around Fairbanks, Alaska, the difference between summer and winter may be so pronounced that there is no overlap. What would an annual mean tell you?

Reply to  Duster
February 20, 2015 7:49 pm

Bingo. What does it matter if temps in Antarctica "warm up" 10 degrees from -52 to -42, more than negating a concurrent 2 degree drop in the Northern Hemisphere? We would not be experiencing warming, but suffering from cold. Averages often have no meaning in reality: the average human has 1.9999 legs. An average global temperature tells us exactly nothing about the climate anywhere on earth, and a trivial two, three, or four degree increase in that average can occur without a corresponding climate change anywhere on the planet.

Danny Thomas
Reply to  Jtom
February 20, 2015 7:57 pm

Jtom,
Average may not mean much, but "a few degrees" regionally can: http://www.weather.com/safety/winter/news/siberian-express-arctic-blast-cold-record-impacts

David A
Reply to  Barry
February 21, 2015 4:30 am

Barry, now get a coin that is constantly changing size, shape, weight distribution, and the slope, texture and density of the surface it is landing on, and now tell me that your law of large numbers still applies.
I assure you it does not, and your simple analogy is worse than useless.

Richvs
February 20, 2015 9:30 am

The biggest challenge I see is to convey to scientific and non-scientific people alike the sheer complexity of the problem involved in modeling global climate. People never seem to be able to grasp that concept. Generally, the thought is that every problem is solvable given enough time and money.

Bubba Cow
Reply to  Richvs
February 20, 2015 10:00 am

Just tell them what Mike Hulme said – (paraphrasing) "They're not so much models as metaphors".
i.e., they're intended to drive political and economic policy, not to predict what is actually observed.

PiperPaul
Reply to  Richvs
February 20, 2015 11:16 am

I work in engineering – when I (or more correctly, “we”) solve problems, often the money stops flowing in. The Climateers appear to have recognized this conundrum of working one’s way into unemployment and found a solution.

David A
Reply to  PiperPaul
February 21, 2015 4:33 am

Yes Paul, most of us non-engineers have the same problem. After we do our job, we have to find more people who also want our services. The Climateers have a house that is never built, yet you and I are forced to live in it.

Chip Javert
Reply to  Richvs
February 20, 2015 3:37 pm

Richvs
Your “biggest challenge” suffers from 97% of the scientific community telling the other 3% (plus 100% of non-scientific people) that this CAGW fantasy is “settled science” (i.e. trust us & our models)…
I personally think it’s foolish to expect “non-scientific people” to understand “the sheer complexity of the problem involved in modeling global climate” – that would make them scientists, or at least mathematicians. Most “non-science people” will simply desert CAGW when they realize predicted warming fails to materialize

ROM
Reply to  Chip Javert
February 20, 2015 11:09 pm

Chip Javert February 20, 2015 at 3:37 pm
“Most “non-science people” will simply desert CAGW when they realize predicted warming fails to materialize”
******************
Most non science people I know who have to work with and try to make a living on what Nature generously provides and equally what Nature indiscriminately and often cruelly takes away, deserted the catastrophic warming and anthropogenic climate change meme quite a long time ago.
They see, experience, get hammered by and sometimes get blessed by Nature in ways they usually can’t imagine prior to the event and that has made them very aware of the immense and completely unpredictable vagaries of Nature and her climate.
[ Nature is always of the female gender with her moods, her swings, sometimes her subtlety and the oft times the caressing of the human psyche if we just stop for awhile and allow her to have her way with us.]
Along with that came an academically created cynicism, now thoroughly entrenched, about the integrity and honesty of the climate scientists when they marched out of their concrete hives to harangue the non-hive dwellers, those country folk who lived and earned their hard-earned at Nature's pleasure, about the immense damage those same folk were doing to the globe and how it all had to stop, because if it didn't then disaster and fire and brimstone and drought would rain down upon us.
And then they would go back to their polluting CO2 spewing concrete hives in their luxurious vehicles spewing all that CO2 for those hundreds of kilometres.
Their predictions were first listened to with respect for they were supposedly Very Important and very, very clever people or so they implied to us mere rural peons in no uncertain fashion.
But all their unchallengeable, scientifically based predictions quickly turned to dust and came to be regarded as little more than the exhaling and spouting of another 40,000 ppm of CO2 every time one of those Very Important and Very Clever People with all that Academic Knowledge of the Climate opened their mournful mouths.
As always, Nature, doing what Nature always does, decided that she would once again do her own thing, and to hell with the Expert Opinions of those Oh So Very Important, So Clever and So Climate Knowledgeable Exalted Ones from deep within the elite of the Hive.
So we had seasons as usual: rain when we wanted it and when we did not, cold, heat; nothing much changed, nor had it changed when those of us of a more mature age looked back across the many decades of our personal past, our family's past history and the old yarns of bygone eras, of drought and fire and floods and rains. Nature as we knew it was still there, still doing her thing, and the predictions of the Clever Ones from the Elite were nothing more than a wind that came and went.
And who knew or cared where it went.
And we, those rural and country folk who had listened to the so very clever ones from the hives, no longer respected or believed the exalted and so clever ones.
In fact we came to despise and detest them for being so dishonest and untruthful about what they knew and what they claimed they could predict of the future, and for demanding of us here in the non-hive areas what they were never prepared to do themselves.
The dwellers of the great Hives of mankind – the hives / cities of over 100,000 population where over one half of mankind now lives, and that after less than 20 generations of evolving from an agrarian species into a still-evolving hive-dwelling species, with all that implies – are now almost completely isolated from Nature and no longer have any personal feeling or understanding of what Nature or her climate really is in the raw.
This increasing isolation from Nature as she really is, and not as the hive dwellers increasingly imagine her, is being created by the fast-moving developments of our technological era, which are leading to a highly artificial, technologically dependent environment and way of living within the hives, one that requires less and less interaction with the natural climate, with natural processes, and therefore with Nature.
In their lack of experience and long-term knowledge of Nature in the raw, the hive dwellers are more and more dependent on the opinions of a self-appointed elite who themselves, despite their highly opinionated vehemence on so many things supposedly natural or supposedly unnatural, such as "dangerous anthropogenic climate change", have very little direct long-term experience or personally gained knowledge of the real and natural world outside of the vast expanding hive they have dwelt in for most or all of their lives.
This is exemplified by the watermelon greens and the global-catastrophe-promoting, latte-sipping elite being concentrated near the main centres of the hives, far removed from, and almost totally isolated by technology from, the real natural world; the "Environment", in their parlance, begins, in their understanding, some tens of kilometres outside the hive / city limits.
****************
“Most “non-science people” have simply deserted CAGW when they realized the predicted warming failed to materialize”

Aphan
Reply to  Chip Javert
February 22, 2015 8:50 am

ROM,
I think I actually swooned reading that 3rd paragraph! Ol Pachauri at the UN should take writing lessons from you!

Alan Robertson
February 20, 2015 9:36 am

“There are no experts on what nobody knows.”
———————–
A statement which, if related to a true believer, is sure to be met with an angry outburst.

F. Ross
Reply to  Alan Robertson
February 20, 2015 11:10 am

+1

Aphan
Reply to  Alan Robertson
February 20, 2015 1:13 pm

“There are no experts on what nobody knows”.
In order to say this with any certainty, wouldn't one HAVE to be an "expert" on the topic, so as to know that both oneself AND everyone else are indeed not experts on what nobody knows? *evil grin* Enigmas, wrapped in riddles and deep-fat-fried in ambiguities….

David A
Reply to  Aphan
February 21, 2015 4:39 am

No. You do not have to be an expert on altimeter design to know that when planes keep flying into cliffs because the altimeter says the plane is five thousand feet higher than it really is, the altimeter design is faulty. You do not have to have one iota of a clue as to how altimeters work.
CAGW is likewise flawed. All the climate models are wrong in the same direction. GIGO is a simple concept.

Pierre DM
February 20, 2015 9:43 am

Thank you Dr. Essex
I now have a much better idea of the problems of modeling

Dawtgtomis
February 20, 2015 9:51 am

I had to giggle at the juxtaposed Kerrys.

Dawtgtomis
Reply to  Dawtgtomis
February 20, 2015 9:57 am

Or Carrey/Kerry actually.

J.Swift
February 20, 2015 10:06 am

As Plato’s perfect circles were only approximations of real movement, so too are computer models only approximations of real systems.

Duster
Reply to  J.Swift
February 20, 2015 3:56 pm

Plato would have argued the opposite: that real motion was merely an approximation of ideal motion.

February 20, 2015 10:07 am

Statistics might indicate “long term trends,” but statistics do not apply to the future!

paullinsay
February 20, 2015 10:24 am

They're hoping that by running many models they will somehow pull out averages that are correct even though they can't predict the time series into the future with any accuracy. That's a pipe dream, since even modestly complicated dynamical systems can have multiple final states, attractors, that have very different statistical properties. Even worse, the initial conditions that lead to the different attractors lie on fractals, so that arbitrarily small changes in them will switch the final state, i.e. you have absolutely no way of knowing which attractor you'll find when the system settles down. An early discussion of this is at
Physica D 17, 125 (1985)
S. W. McDonald, C. Grebogi, E. Ott, and J.A. Yorke.
Basin boundaries of dynamical systems can be either smooth or fractal. This paper investigates fractal basin boundaries. One practical consequence of such boundaries is that they can lead to great difficulty in predicting to which attractor a system eventually goes. The structure of fractal basin boundaries can be classified as being either locally connected or locally disconnected. Examples and discussion of both types of structures are given, and it appears that fractal basin boundaries should be common in typical dynamical systems. Lyapunov numbers and the dimension for the measure generated by inverse orbits are also discussed.
The physics is wrong and the math is wrong. What’s left to discuss about climate models?
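The practical difficulty the abstract describes shows up even in very humble systems. Below is a standard textbook illustration in Python (my sketch, not taken from the paper): Newton's method for z^3 = 1 has three attracting roots whose basins of attraction have fractal boundaries, and however finely you bracket that boundary, the two ends of the bracket still settle onto different attractors.

```python
# Sketch of fractal basin boundaries: Newton's method for z^3 = 1.
# Bisect the segment between two roots; no matter how small the bracket gets,
# its endpoints keep converging to different attractors, i.e. arbitrarily
# small changes in the initial condition switch the final state.
import numpy as np

roots = np.exp(2j * np.pi * np.arange(3) / 3)      # the three cube roots of 1

def reached_root(z, iters=200):
    """Iterate the Newton map z -> z - (z^3 - 1)/(3 z^2); report the nearest root."""
    for _ in range(iters):
        z = z - (z**3 - 1) / (3 * z**2)
    return int(np.argmin(np.abs(roots - z)))

lo, hi = roots[0], roots[1]                        # each root attracts itself
while abs(hi - lo) > 1e-12:
    mid = (lo + hi) / 2
    if reached_root(mid) == reached_root(lo):
        lo = mid                                   # keep the bracket straddling the boundary
    else:
        hi = mid
print(f"endpoints {abs(hi - lo):.1e} apart reach root {reached_root(lo)} "
      f"and root {reached_root(hi)}")
```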

Curious George
Reply to  paullinsay
February 20, 2015 10:29 am

Two wrongs make a right, if you live off Climate Change grants.

Aphan
Reply to  Curious George
February 20, 2015 1:15 pm

Two wrongs don’t make a right, but three lefts do. 🙂

David A
Reply to  Curious George
February 21, 2015 4:46 am

It depends on their degree of wrongness. The computer climate models are about 180 degrees wrong, so three 90 degree lefts leaves them still wrong. However when they turn 180 degrees (admit that CO2 is net beneficial) then they will finally have it right.

Aphan
Reply to  Curious George
February 21, 2015 10:02 am

Hahaha! Of course you are talking about making “it right” vs “a right”, and I see no sign that they are going to voluntarily make things right. I loved your take on the altimeter analogy! I hope Nick does too.

David A
Reply to  Curious George
February 21, 2015 8:27 pm

Thanks Aphan. I will see, but it is not likely that Nick responded. He is disingenuous in many things he says.

Santa Baby
Reply to  paullinsay
February 20, 2015 10:39 am

They are policy-based, built to promote a political agenda. It's not science, nor about science.

chris moffatt
February 20, 2015 10:32 am

A computer model is simply a computer program that is designed to replicate a real world system. As with any computer program if you cannot define exactly what the process is you cannot get realistic results. We can do simulations of such things as missile flight because we know the kinematics involved and the characteristics of the missile exactly. We can similarly model many other real world events – as long as we can describe them exactly in a form the computers can handle. Mathematical forms are perfect for this – we have decades of knowledge and developed software methods for solving such problems. In addition to being able to define the process and express it mathematically you must also select the initial parameters. If these are not accurate you cannot get a realistic model result.
In addition to mathematical models reflecting physical realities there are many statistically based models. These are approximations. You will likely get a usable result but it won’t likely be quite correct. Richard T (above) has explained how approximations piled on approximations piled on complete omissions (because we just don’t know everything about climate) will lead to gross errors in short order. There still remains the problem for modelers of selection of initial parameters.
This should be a problem for climate modellers because in order to validate your model you have to replicate the past. There is no other way to validate it. This means that you have to select the exact climatic parameters for the time period you are modelling. So just what was the total world climate doing at midnight, January 1st, 1964? All the things you don’t know about climate are going to stop you cold. Your model will be rubbish and its results will be rubbish.
This should not be a mystery to anyone who took CS101/102. In my day Engineers and science students had to take those courses. I guess it’s different nowadays.
One more thing comes to mind, having written more code than I care to think about: the number of just plain coding errors in these programs. These are huge programs. Do we have any assurance whatever that they have been fully and correctly debugged? How many models are available for scrutiny? And who on earth would want to do the monster, thankless chore of debugging just so they could be the goat? I can just hear Phil Jones now – he debugged our code and found an error in it, boo-hoo, boo-hoo.

Reply to  chris moffatt
February 20, 2015 10:39 am

Hm, backcasting isn't validation. But GCMs fail even at that!

chris moffatt
Reply to  Johan
February 20, 2015 2:36 pm

If you can replicate the actuals with your model, your model may be correct or incorrect; but with a climate model, that is all you have to go on. If you cannot replicate the actuals with regularity, at least to a reasonable closeness, using various initial conditions drawn from reality, then your model is wrong. No ifs, ands or buts about it.

Reply to  chris moffatt
February 20, 2015 11:28 am

You're right, we can do a lot better with missiles than with the weather. But even then you can't get the target as a mathematical point. You get a circular error probable (CEP). Getting the CEP from miles down to hundreds of yards is not too awfully difficult, but getting it down to feet is impossible. For that, active guidance works wonders.

chris moffatt
Reply to  logicalchemist
February 20, 2015 2:29 pm

You're quite right of course. Essentially the only way to deal with the precision problem is to truncate values, and those approximations lead to the situation described in the video. Of course in the dim and distant past we started with analog computers, because digital ones were so limited in their function and size. With an analog computer every run is a crapshoot because of component tolerances, noise, fluctuating power, etc. Very interesting, but a little frustrating. But with a proximity fuze and an expanding rod you can get close enough, on paper and in reality, to do the job.

Gamecock
Reply to  chris moffatt
February 20, 2015 2:33 pm

“One more thing comes to mind, having written more ciode than I care to think about, is the number of just plain code errors in these programs.”
Likely, yes. HIGHLY LIKELY . . . because . . . .
We hear that models run on super duper computers and can take up to three months. When I was a computer scientist, I would get calls (sometimes in the middle of the night), complaining that a program had been running three hours. That a program can tie up a gazillion dollar computer for three months tells me that the computing environment is lax. Undisciplined. While I can’t prove it, careless errors are likely the norm.
In my day, a program taking three months would be scrutinized for rational design, and optimized for fastest processing. I deconstructed plenty of programs in my time to get them to run efficiently. For example, I would find database queries joining 7 tables. The vendor told them it would do it. The vendor didn’t tell them that separating them into 2-3 table joins, and saving the data in temporary tables, would give the same result in a lot less time. Technically, the earlier programmers had not made a mistake. They had taken advantage of a “feature.”
Our business environment demanded efficiency that I expect to be missing from the academic world.

jorgekafkazar
Reply to  Gamecock
February 20, 2015 4:04 pm

These models are trying to do so many 3D iterative global computations on such a tiny grid scale that they really do require months to run. No poor programming practices necessary. This calculational Tower of Babel is genuine. Insane, but genuine. Is the software well written? I doubt it. Look at Phil Jones, who couldn’t even handle a trend in an Excel spreadsheet. Read the Harry Read Me file for examples Harry found of incompetent code at UEA. Poor Harry. Poor us, who will have to pay for this hoax.

Chip Javert
Reply to  chris moffatt
February 20, 2015 3:43 pm

Chris
I can easily create a fancy “model” that 100% accurately reproduces the pre-existing record of 100 coin flips (even more than 100 if you wish!).
Only a fool would expect it to accurately predict the next 100 flips. Just saying…
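For what it's worth, here is that "fancy model" made literal in a few lines of Python (a deliberately silly sketch of my own): it reproduces the 100 past flips with 100% accuracy and predicts the next 100 no better than chance.

```python
# Perfect hindcast, useless forecast: a "model" that just memorises the past.
import random

random.seed(1)
past = [random.randint(0, 1) for _ in range(100)]     # the historical record
future = [random.randint(0, 1) for _ in range(100)]   # what actually happens next

model = {i: flip for i, flip in enumerate(past)}      # the fancy model: a lookup table

hindcast = sum(model[i] == past[i] for i in range(100))
forecast = sum(model[i] == future[i] for i in range(100))
print(f"reproduces the past: {hindcast}/100")         # always 100/100
print(f"predicts the future: {forecast}/100")         # close to 50/100, i.e. chance
```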

Walt Allensworth
February 20, 2015 10:33 am

This quote is 14 years old… WUWT?

Alan Robertson
Reply to  Walt Allensworth
February 20, 2015 12:41 pm

The IPCC knew enough to make that quote 14 years ago, yet they persist.
WUWT?

tadchem
February 20, 2015 10:34 am

‘Climate’ is about the averages of *observations*, and is necessarily a *historical* concept.
Modelling is about drawing analogies between two similar systems – one physical and one mathematical.
Analogies are like ropes – they tie things together fairly well, but you can’t get anywhere by pushing them.
Insofar as the two systems – the physically observed and the mathematically predicted – align with each other, the model is a useful tool for *interpolating* to estimate the results of measurements that have not been made.
The peril is encountered when one tries to *extrapolate*. This is because each data point added to the basis for the model expands the universe of possible models by adding an entire mathematical dimension. The model available for making estimates *before* the added data point was available becomes inadequate – there is now an infinite universe of possible mathematical models that will exactly fit the original data set, only one of which will also *extrapolate* correctly to include the new data point. Until you have the new data point in hand, there is no way to know which of the new models will be the one that works correctly. Models cannot be relied upon for correct extrapolations.
This is not pedantry, but a direct consequence of the Nyquist–Shannon sampling theorem.
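A minimal Python sketch of that point, tied directly to the sampling theorem (my illustration, with made-up numbers): two "models" agree exactly at every sampled time because they differ by an oscillation the sampling cannot resolve, yet they disagree the moment you interpolate between the samples or extrapolate beyond them, and nothing in the data says which one to trust.

```python
# Aliasing in one screenful: model_b differs from model_a by a sine that
# vanishes at every sample time, so the observations cannot tell them apart,
# but interpolation and extrapolation give different answers.
import numpy as np

samples = np.arange(10.0)                      # observation times t = 0, 1, ..., 9

def model_a(t):
    return 0.1 * t                             # a slow linear trend

def model_b(t):
    return 0.1 * t + 2.0 * np.sin(np.pi * t)   # same trend plus an unresolved oscillation

print("max difference at the samples:", np.max(np.abs(model_a(samples) - model_b(samples))))
print("difference at t = 9.5        :", abs(model_a(9.5) - model_b(9.5)))    # interpolation
print("difference at t = 20.5       :", abs(model_a(20.5) - model_b(20.5)))  # extrapolation
# The first number is zero up to floating-point rounding; the other two are 2.0.
```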

Reply to  tadchem
February 20, 2015 1:15 pm

Analogies are like ropes – they tie things together fairly well, but you can’t get anywhere by pushing them.

True, that would be like shooting pool with a rope.

Duster
Reply to  Roy Denio
February 20, 2015 3:58 pm

+1

Aphan
Reply to  Roy Denio
February 21, 2015 10:09 am

I tried to resist..but I am weak,
All I could think of was….people like Mann are pretending to shoot pool with a broken hockey stick and no….spherical, enameled objects. *grin*

tadchem
February 20, 2015 10:37 am

Corrigendum: …there is NOW an infinite universe of possible mathematical models…

Mark from the Midwest
February 20, 2015 10:51 am

Yeah, what Dr. Essex said

John Whitman
February 20, 2015 11:10 am

Thank you Dr. Chris Essex for a calmly presented talk containing plain reasoning that provokes some fundamental thought.
Lewis Carroll was the pen name of Charles Lutwidge Dodgson who was a mathematician and logician (among other interests). He would have appreciated the talk by Dr. Chris Essex.
What is of interest to me is why the major government scientific organizations consciously and premeditatedly chose a primary strategy of prioritizing GCMs (and other types of models) in their PR promoting the observationally challenged theory of significant climate change from CO2 produced by burning fossil fuels.
There is some fundamental means of understanding the irrational climate movement if we know the 'why' behind the major government scientific organizations' choice of strategy.
I am tending recently to think the answer to that 'why' is related to a new** philosophy of science that is specifically tailored to government-supported scientific structural processes only.
** new as in becoming publicly obvious starting post-WWII
John

M E Wood
Reply to  John Whitman
February 20, 2015 11:52 am

If I may, I think you are right. I was just going to point out that Lewis Carroll is believed to have been poking fun at philosophers he knew. I wonder if the present philosophy is not a return to an old Persian and Indian philosophy in which the lower orders of society were of no importance. The higher forms of human life were more evolved and did no harm to animals (or in some cases to plants), so that they could rise into a higher, more spiritual realm, and in some forms of this philosophy the ultimate goal is to become non-existent, much like the Gnostics of the Later Roman Empire. Peasants and the poor do not count in their view. Maybe there are still such philosophies in the Far East today, which is why the United Nations is so keen on Global Warming: many of them will have this at the back of their minds in all deliberations.

Alan Robertson
Reply to  M E Wood
February 20, 2015 12:17 pm

In the 70s, I lived in an Asian nation which published its newspapers in Chinese, rather than in the nation's own alphabet. Only the educated elite members of society understood Chinese characters. The implications of that one simple peek into their society were sobering.
The structure of US academia revolves around the same idea.

John Whitman
Reply to  M E Wood
February 20, 2015 5:46 pm

M E Wood on February 20, 2015 at 11:52 am
M E Wood,
Certainly the meme of the enlightened few, with the rest being subservient to them, is Plato's philosophy, and the same argument can be made for that Neo-Platonist, Kant. Certainly Asia had its share of those kinds of views.
I do not see America ever being led, de facto, by Scientist Kings who style themselves after Philosopher Kings.
John

John Whitman
Reply to  M E Wood
February 20, 2015 5:55 pm

Sorry for the blockquote tag formatting error in my comment ‘John Whitman on February 20, 2015 at 5:46 pm’
John
[removed. .mod]

John Whitman
Reply to  M E Wood
February 20, 2015 6:19 pm

[removed. .m o d]
– – – – – –
.m o d,
Thanks.
John

Jeff B.
February 20, 2015 11:13 am

Yeah the models are all junk. But let’s panic and train wreck the economy just in case. Cuz that is what good Socialists do!

F. Ross
February 20, 2015 11:15 am

Dr. Chris Essex
A very interesting and provocative post. Thank you.
Now if the media would just run with it every time there is another dire prediction by the CAGWers.

KenB
February 20, 2015 11:37 am

Every politician intending to base their decisions on model "projections" (the believers) should be made to listen to that lecture until they understand what Dr Essex is saying, or at least until they understand the simple truth that the models cannot deliver the strength of prediction presented to them by the charlatans and the media.

Yirgach
February 20, 2015 12:05 pm

Here is the 2013 version of this lecture. In some ways it is easier to follow, however in other ways the latest version has some more interesting data. BTW, The “green energy” laser pointer worked a lot better in 2013.
Maybe time to change the battery, doc.

Bill H
February 20, 2015 12:30 pm

I’m somewhat confused at all this hostility towards climate models since so many such models have been presented on WUWT and other skeptic sites over the years to a generally warm reception.
Examples include the recent notch model of Evans, championed by Jo Nova and Viscount Monckton; regular WUWT contributor David Archibald's solar cycles model of 2008, predicting a 1.5 degree decrease in temperature between 2008 and 2020 (methinks we can agree on a Fail for Mr. Archibald); similar models by Easterbrook and Orssengo from some years back predicting cooling; Monckton's modelling of climate sensitivity; and so on and so forth.
In view of the consensus on climate models expressed in this thread, am I to assume that all these hitherto well-received models are now to be treated as dross?

Reply to  Bill H
February 20, 2015 12:51 pm

Bill H,
Models are not the problem; the misuse of climate models is the problem. The complete failure of climate models to make consistent and accurate predictions has discredited GCMs.
For example, no GCM was able to forecast the most significant event of the past century: the fact that global warming has stopped. And not just temporarily: global warming has stopped for anywhere from 10 to 18 years, depending on the yardstick used. That is a long time! But no models predicted that "pause".
Since GCMs cannot resolve clouds and must crudely parameterize them, naturally they will be wrong. Clouds have an enormous effect. They reflect solar energy to space. Conversely, when clouds thin they allow more energy in. See Willis Eschenbach's related articles on emergent phenomena.
There are some very simple models that do every bit as well as the best supercomputer GCMs [if not better], and they don’t cost anything. The sensible course of action would be to sell the expensive computers, lay off the programmers and scientists, and use the simplest of models.
But we all know the real reason for expensive models and computers: it’s a gravy train that a favored clique of scientists ride. Their job is to scare the public. Their pay is continued employment, expense-paid travel to exotic locations, and relative fame in their community.
There is more real science done here than with all the multi-million dollar GCMs.

Reply to  Bill H
February 20, 2015 3:02 pm

To see how to make forecasts without using GCM type models see my earlier comment
http://wattsupwiththat.com/2015/02/20/believing-in-six-impossible-things-before-breakfast-and-climate-models/#comment-1864274
The answer to your question depends on how you define models. Monckton's efforts are based on bottom-up methods similar to the IPCC models and therefore are also not useful for forecasting. Easterbrook, Archibald and, e.g., Scafetta are not of the same type, being much simpler and closer to the basic solar driver and temperature data, so their outcomes will be closer to reality. However, they still rely too much on curve fitting to some mathematical formula, which nature knows nothing about. It is better simply to look at patterns in the temperature and driver data and eyeball them forward, using common sense and general knowledge of past events rather than some mathematical formula.

Duster
Reply to  Bill H
February 20, 2015 4:09 pm

Bill,
It can be argued that all science involves "models." The idea is that science investigates subjects of interest by separating the system into its irreducible components and then identifying the minimum elements that reproduce the gross system behaviour; you can then claim to have "understood" the system. Field science runs up against the reality that no system operates independently of its environment, and that lack of independence influences the initial states of the remaining system components. That is fine as long as the model's variance matches that of the real system and the two have a similar centre. When a model runs perpetually biased, and its variance doesn't replicate the variance of the real system, then you plainly have a problem. The consensus has nothing to do with it. The important measure in science is direct utility. No climate model so far has direct utility for an understanding of climate. They are well applied in the political and economic sciences, where they are employed to generate opinions and funding.

Tom Crozier
February 20, 2015 12:38 pm

Great presentation, I learned a lot.

Nick Stokes
February 20, 2015 12:44 pm

The full quote says:
“In sum, a strategy must recognise what is possible. In climate research and modelling, we should recognise that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible. The most we can expect to achieve is the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions. This reduces climate change to the discernment of significant differences in the statistics of such ensembles.”
It doesn’t say we can’t know anything. It says we’ll know with uncertainty that can be quantified. Which shouldn’t surprise anyone.
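For readers wondering what "prediction of the probability distribution ... by ensembles" looks like in practice, here is a toy Python sketch using the chaotic logistic map (my illustration; a GCM ensemble is of course vastly more complicated): individual members are unpredictable, but the ensemble statistics settle down, provided the map itself and its parameters are exactly right, which is precisely the part being disputed in this thread.

```python
# Toy ensemble "climate": individual logistic-map trajectories are
# unpredictable, but statistics over many perturbed runs are stable --
# as long as the model generating them is exactly right.
import numpy as np

rng = np.random.default_rng(0)
x = 0.3 + 1e-10 * rng.standard_normal(10000)   # ensemble of perturbed initial states

for _ in range(1000):                          # iterate x -> 4x(1 - x)
    x = 4.0 * x * (1.0 - x)

print("two individual members:", x[0], x[1])   # effectively unrelated
print("ensemble mean         :", x.mean())     # close to the invariant mean 0.5
print("ensemble std dev      :", x.std())      # close to sqrt(1/8), about 0.354
```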

Yirgach
Reply to  Nick Stokes
February 20, 2015 1:23 pm

I really do not understand how we’ll know with uncertainty that can be quantified.
WTF? You really think you can quantify the uncertainty in a coupled non-linear chaotic system?
With what level of certainty is your discernment of the uncertainty?
Many, many decades ago I was exposed to LSD, Mescalin and Psilocybin.
This type of logic is eerily similar to the effects of those drugs on my perception of reality.

Alan Robertson
Reply to  Yirgach
February 20, 2015 2:40 pm

Quantified uncertainty is such a simple thing. “Make it fit the meme”.

Aphan
Reply to  Nick Stokes
February 20, 2015 3:01 pm

It says “The most we can expect to achieve is the prediction of the probability distribution of the system’s future POSSIBLE states…(by the generation of ensembles of model solutions)…”This reduces climate change to the discernment of significant differences in the statistics of such ensembles.”
Prediction – synonyms: forecast, prophecy, prognosis, prognostication, augury; projection, conjecture, guess.
Probability distribution – a function of a discrete variable whose integral over any interval is the probability that the random variable specified by it will lie within that interval.
Chaotic system – a complex system that shows sensitivity to initial conditions, such as an economy, a stock market, or weather. In such systems any uncertainty (no matter how small) in the beginning will produce rapidly escalating and compounding errors in the prediction of the system's future behavior. To make an accurate prediction of long-term behavior of such systems, the initial conditions must be known in their entirety and to an infinite level of accuracy. In other words, it is impossible to predict the future behavior of any complex (chaotic) system.
The most we can expect to achieve is conjecture: guessing, projecting based upon the discrete variables that we know of, which, when combined in a system containing variables that we DO NOT know of, become completely meaningless, because our lack of knowledge of the entirety of the initial conditions to an infinite level of accuracy produces rapidly escalating and compounding errors in the predictions of the system's future behavior.
If it is impossible to even identify and quantify the initial conditions of our climate to an infinite level of accuracy, then the possible levels of uncertainty are infinite as well….aren’t they?

Curious George
Reply to  Nick Stokes
February 20, 2015 4:41 pm

"We will know with uncertainty that CAN be quantified." Three years ago I asked NCAR to quantify the impact of an up-to-3% error in the latent heat of water. They could not do it. As far as I know, the error is still there. Why should I take these clowns seriously – other than as a black hole for my tax money?

Curious George
Reply to  Nick Stokes
February 20, 2015 4:45 pm

Sorry for a typo – an error in a latent heat of water vaporization.

Richard M
Reply to  Nick Stokes
February 20, 2015 7:38 pm

It's a word game, Nick. While the statement is true to a certain degree, the probabilities provide us no information. They are all zero. That's what you get when you're off by 30 orders of magnitude.

KevinK
Reply to  Nick Stokes
February 20, 2015 8:16 pm

Nick, with respect;
“It doesn’t say we can’t know anything. It says we’ll know with uncertainty that can be quantified. Which shouldn’t surprise anyone.”
Nobody in climate science has ever attempted any SERIOUS quantification of the uncertainties. 0.001 degrees per week… really…. The climate science community has totally dismissed the "uncertainties" in the science. As soon as they can tell us the exact temperature on July 13, 1850 at a hundred thousand locations across the surface of the Earth, with thermometers all calibrated to a single temperature standard, the engineering community might take you seriously. Until then, the climate "models" are just a very bad joke, as explained in Dr. Essex's lecture posted here. You should listen to it several times, very SLOWLY, to understand the important points he presents in well-explained detail.
Not knowing the uncertainties in engineering gets people KILLED, very dead.
The climate “models” are a poor jest at best, averaging an “ensemble” of poor jests becomes a “cruel joke” and folks that believe they can tell us what will happen in a century are “NUTS”. I am empathetic to those that have dedicated their career to this, but history will not be kind to them.
Cheers, KevinK.

Janice Moore
Reply to  Nick Stokes
February 21, 2015 1:35 pm

Re: “… we’ll know with uncertainty that can be quantified.”
With 100 km wide holes in our knowledge, the HONEST thing to say is, as Dr. Essex essentially does:
We don’t know.
**************************************
Further,
your empty assertion (far above) to the effect of: “someday, we might know” is worthless.

February 20, 2015 12:55 pm

We are all going to die from catastrophic anthropogenic climate modelling.

Bill H
Reply to  Max Photon
February 21, 2015 12:09 am

DBStealey,
I suggest you re-read Dr. Essex's article. He is attacking the models themselves, not their "misuse", as are most of the preceding comments – e.g. paullinsay, describing GCMs: "The physics is wrong and the math is wrong. What's left to discuss about climate models?"

John Boles
February 20, 2015 12:59 pm

GREAT LECTURE!! thanks for posting, very informative, educational.
John in Rochester Michigan

Kevin Kilty
February 20, 2015 1:33 pm

Rather the focus must be upon the prediction of the probability distribution of the system’s future possible states by the generation of ensembles of model solutions.

While I have some sympathy for this approach, if it is done right with an ensemble that truly aims to cover the space of possible outcomes, the way it works in practice is akin to believing that you can take a committee of half a dozen people, none of whom is competent, and that the diversity they represent will lead to a great decision.

Aphan
Reply to  Kevin Kilty
February 20, 2015 3:10 pm

(De-motivational poster of a large snowball carving its way down a snowy mountainside)
“A few harmless flakes working together can release an avalanche of destruction”

February 20, 2015 1:43 pm

This discussion of Navier-Stokes reminds me of a little poem I wrote some time ago.
VISCOUS THINKING
Big fools have little fools
Who feed on their “lucidity”
And little fools have lesser fools
And so on to stupidity.
~ Max Photon

Aphan
Reply to  Max Photon
February 20, 2015 5:28 pm

In science all have favorites,
Some neutron others proton,
But if you ask who I like best,
I’d have to say “Max Photon!”

Reply to  Aphan
February 20, 2015 7:04 pm

+1

n.n
February 20, 2015 2:49 pm

Statistical inference in a chaotic system is only valid over an indefinite range. A chaotic process can only be marginally represented by a complex multivariate distribution with liberal assumptions of independence and uniformity over time and space. Without these extraordinary and unjustified assumptions, this function is not only undetermined, but its discovery remains improbable. Statistical methods, as with the scientific method itself, are only valid within an exceedingly small frame of reference in both time and space, where accuracy is inversely proportionate to the product of time and space offsets from an established frame of reference. The scientific method is a process and method invented to constrain people from conflating philosophy and faith with science.

Gary
February 20, 2015 3:08 pm

Yes yes. More of this please. I felt like I was back in school. I watched the entire thing. I’d love to see more educational videos of this sort in the future! The longer the better!

February 20, 2015 7:40 pm

Help me out here folks as I am not a “science guy”. I think I understand the problem of the current “grid” being too large due to computational and scientific limitations. However, even if the grid could be reduced and new models were based on that, how in the hell would we know what was going to happen within the grid anyway?
We will never know what the sun is going to do, much less the dozens of other natural phenomena, i.e. cosmic rays, planetary vegetation, winds, volcanoes, animal activity, etc. I really don't see how, given the enormity of possible combinations of climatic events, we could ever know within any degree of accuracy what global temperatures will be in 100 years, never mind what effect that variability would have.
Then of course we would also need to know the efficacy of the proposed so-called "solutions" to the problem, which include among other things carbon taxes and such. Mind you, all of this hysteria is based on relatively minor changes in temperature, i.e. 2 degrees or so. I am not seeing it. I do wish there were more actual scientists coming forward (like the man in this presentation) to rationally discuss these things.

David A
February 21, 2015 4:55 am

Gary, much of what you say is supportable. One thing, however, is certain: if you ignore the fact that your models are consistently wrong in ONE direction, you will always get the wrong answer.
In a nutshell this is what the IPCC is paid to do, and they do it well.

Bernie Hutchins
February 21, 2015 1:41 pm

Chris Essex has given us an important talk – obviously. Many thanks.
When we decide to solve an equation numerically (subject to finite representation) instead of in continuous closed form (analytically), we KNOW we cross SOME line and have an "elephant in the room". However, the usual view is perhaps that the elephant can be kept in the FAR corner just by making the step size smaller. I think Essex's pointing out that the two cases (continuous/discrete) involve different symmetries and conserved quantities changes that schism into a more categorical one, where we are hiding in the NEAR corner from the roaming elephant, and it is the discrete modeler who is thus cornered. Often we get away with it – in the simpler cases.
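A minimal Python sketch of that schism (my illustration, not from the talk): the continuous harmonic oscillator conserves energy exactly, a forward-Euler discretisation of it does not, and a symplectic discretisation conserves a slightly different quantity instead. The discrete model has its own symmetries and conservation laws, not those of the equations it was derived from.

```python
# Continuous vs discrete conservation: energy E = (x^2 + v^2)/2 for x'' = -x.
dt, steps = 0.01, 100000

def energy(x, v):
    return 0.5 * (x * x + v * v)               # exactly 0.5 for the true solution

# Forward Euler: the discrete "energy" grows every step.
x, v = 1.0, 0.0
for _ in range(steps):
    x, v = x + dt * v, v - dt * x
print("forward Euler    energy:", energy(x, v))   # drifts far above 0.5

# Symplectic (semi-implicit) Euler: a nearby quantity is conserved instead.
x, v = 1.0, 0.0
for _ in range(steps):
    v = v - dt * x
    x = x + dt * v
print("symplectic Euler energy:", energy(x, v))   # stays close to 0.5
```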

February 21, 2015 11:34 pm

My model of Climate Change.
Climate depends on the Earth's albedo, not on CO2, methane and man; those are only accelerators of Climate Change. One-sided growth of the Earth's inner core http://go.nature.com/w6iks3, deforming the crust from within http://yotu.be/edPhYeDrNIY, changes the shape of the planet, on which the albedo depends.
Albedo and solar activity drive fluctuations of solar radiation in the Earth's atmosphere. Atmospheric pressure depends on the level of solar radiation, and the wind regime depends on the pressure http://www.newsweek.com/speaking-green-tongues-scientist-discovers-new-plant-language-264734. The growth of the core continues, so the albedo keeps changing, and so on in a domino effect. The constant fluctuation of the level of solar radiation leads to abrupt and extreme weather changes in different regions of the planet, turning Earth into a planet of storms. Climate Change is an indicator of the speed, and one of the flags, of an approaching catastrophe.
[Translated from the Russian original. .mod]

kcrucible
February 22, 2015 5:37 am

“Much of the cagw scare is about increased variance or extremes ”
Only recently. Up until things started getting cold, we were never going to see snow again. This new “extreme weather” talking point was created as PR. The “99% certain” science at the time never mentioned it.
“or about tipping points”
Which is conjecture that they can’t actually back up in any meaningful way.

Jon
February 22, 2015 1:52 pm

Is this the same, but different? https://www.youtube.com/watch?v=hvhipLNeda4

Robert S
February 23, 2015 8:16 am

lost my blogging capability

February 24, 2015 9:32 am

Chemical engineers, among others, routinely solve turbulent fluid flow problems graphically with just a nod to the Navier-Stokes equations. For incompressible flow in pipes, the friction factor vs Reynolds number chart for Re < 2100 is derived from the Navier-Stokes equations, giving f = 16/Re (the Hagen-Poiseuille equation). Experimental data for turbulent flow give the Blasius equation; the whole lot, for Re from 10^3 to 10^8, has been conveniently charted, simplifying pressure-drop calculations. Although turbulent flow in pipes and channels presents few problems to practicing engineers, it is useful to be reminded that a direct solution of Navier-Stokes in the turbulent region has still to be done (and carries a $10^6 prize). Politicians and policy makers should be constantly reminded that computer models used to forecast future global warming are completely meaningless.
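For the curious, that engineering shortcut fits in a few lines of Python (a sketch in the Fanning convention; the correlation limits are approximate and the laminar-turbulent transition region is glossed over): the laminar branch comes straight from Navier-Stokes, the turbulent branch from the experimental Blasius fit, and the turbulence problem itself is never "solved".

```python
# Fanning friction factor for smooth pipes: Hagen-Poiseuille (laminar) plus
# the Blasius correlation (an experimental fit, commonly used up to Re ~ 1e5).
def fanning_friction_factor(Re):
    if Re < 2100.0:                  # laminar: exact result from Navier-Stokes
        return 16.0 / Re
    return 0.079 * Re ** -0.25       # turbulent: Blasius fit, not a solution of N-S

for Re in (500.0, 1.0e4, 1.0e5):
    print(f"Re = {Re:>9.0f}   f = {fanning_friction_factor(Re):.5f}")
# Pressure drop per unit length then follows from dP/L = 2 f rho u^2 / D,
# without ever solving the turbulent Navier-Stokes equations directly.
```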

garymount
February 26, 2015 5:34 am