A Stove Top Analogy to Climate Models

Reposted from Dr. Roy Spencer’s blog

September 13th, 2019 by Roy W. Spencer, Ph.D.

Have you ever wondered, “How can we predict global average temperature change when we don’t even know what the global average temperature is?”

Or maybe, “How can climate models produce any meaningful forecasts when they have such large errors in their component energy fluxes?” (This is the issue I’ve been debating with Dr. Pat Frank after publication of his Propagation of Error and the Reliability of Global Air Temperature Projections.)

I like using simple analogies to demonstrate basic concepts.

Pots of Water on the Stove

A pot of water warming on a gas stove is useful for demonstrating basic concepts of energy gain and energy loss, which together determine the temperature of the water in the pot.

If we view the pot of water as a simple analogy to the climate system, with a stove flame (solar input) heating the pot, we can see that two identical pots can have the same temperature but different rates of energy gain and loss if (for example) we place a lid on one of the pots.

[Figure: two pots of water at the same temperature but with different energy fluxes]

A lid reduces the warming water’s ability to cool, so the water temperature goes up (for the same rate of energy input) compared to a pot with no lid. As a result, a lower flame is necessary to maintain the same water temperature as the pot without a lid. The lid is analogous to Earth’s greenhouse effect, which reduces the ability of the Earth’s surface to cool to outer space.

The two pots in the above cartoon are analogous to two climate models having different energy fluxes with known (and unknown) errors in them. The models can be adjusted so the various energy fluxes balance in the long term (over centuries) but still maintain a constant global average surface air temperature somewhere close to that observed. (The model behavior is also compared to many observed ocean and atmospheric variables. Surface air temperature is only one.)

Next, imagine that we had twenty pots with various amounts of coverage of the pots by the lids: from no coverage to complete coverage. This would be analogous to 20 climate models having various amounts of greenhouse effect (which depends mostly on high clouds [Frank’s longwave cloud forcing in his paper] and water vapor distributions). We can adjust the flame intensity until all pots read 150 deg. F. This is analogous to adjusting (say) low cloud amounts in the climate models, since low clouds have a strong cooling effect on the climate system by limiting solar heating of the surface.

Numerically Modeling the Pot of Water on the Stove

Now, let’s say we build a time-dependent computer model of the stove-pot-lid system. It has equations for the energy input from the flame, and for the loss of energy from conduction, convection, radiation, and evaporation.

Clearly, we cannot model each component of the energy fluxes exactly, because (1) we can’t even measure them exactly, and (2) even if we could measure them exactly, we cannot exactly model the relevant physical processes. Modeling of real-world systems always involves approximations. We don’t know exactly how much energy is being transferred from the flame to the pot. We don’t know exactly how fast the pot is losing energy to its surroundings from conduction, radiation, and evaporation of water.

But we do know that if the water temperature is constant, those rates of energy gain and energy loss are equal, even though we don’t know their values.

Thus, we can either make ad hoc bias adjustments to the various energy fluxes to get as close to the desired water temperature as we want (this is what climate models did many years ago), or we can make more physically based adjustments, since every computation of a physical process affecting energy transfer has uncertainties — say, in a coefficient of turbulent heat loss from the pot to the air. The latter is how modern climate models are adjusted today.

If we then take the resulting “pot model” (ha-ha) that produces a water temperature of 150 deg. F as it is integrated over time, with all of its uncertain physical approximations or ad-hoc energy flux corrections, and run it with a little more coverage of the pot by the lid, we know the modeled water temperature will increase. That part of the physics is still in the model.
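As a sketch of what such a “pot model” amounts to in practice, here is a minimal toy version. All coefficients are invented purely for illustration (nothing is taken from an actual climate model): the flame for each pot is tuned so the water holds 150 deg. F at whatever lid coverage that pot has, and then the lid coverage is increased without retuning. The modeled water warms, just as described above.

```python
# Toy "pot model" -- all coefficients invented for illustration only.
AMBIENT_F = 70.0  # assumed room temperature, deg F

def equilibrium_temp(flame, lid_cover, k_loss=2.0):
    """Steady-state water temperature: flame input balances a lumped
    loss term; the lid blocks up to half of the heat loss."""
    k_eff = k_loss * (1.0 - 0.5 * lid_cover)
    return AMBIENT_F + flame / k_eff

def tune_flame(target, lid_cover, k_loss=2.0):
    """Solve for the flame power that holds the target temperature --
    the analogue of tuning a climate model's low clouds."""
    k_eff = k_loss * (1.0 - 0.5 * lid_cover)
    return k_eff * (target - AMBIENT_F)

# Twenty pots with lid coverage from 0% to 95%, each tuned to 150 F:
covers = [i / 20.0 for i in range(20)]
flames = [tune_flame(150.0, c) for c in covers]
assert all(abs(equilibrium_temp(f, c) - 150.0) < 1e-9
           for f, c in zip(flames, covers))

# Now add a little more lid to the first pot WITHOUT retuning the flame:
print(equilibrium_temp(flames[0], 0.10) - 150.0)  # positive: the water warms
```

The point of the sketch is that the tuned flame values absorb whatever errors exist in the loss coefficient, yet the warming response to "more lid" survives.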

Example Pot Model (Getty images).

This is why climate models can have uncertain energy fluxes, with substantial known (or even unknown) errors in their energy flux components, and still be run with increasing CO2 to produce warming, even though that CO2 effect might be small compared to the errors. The errors have been adjusted so they sum to zero in the long-term average.

This directly contradicts the succinctly-stated main conclusion of Frank’s paper:

“LWCF [longwave cloud forcing] calibration error is +/- 144 x larger than the annual average increase in GHG forcing. This fact alone makes any possible global effect of anthropogenic CO2 emissions invisible to present climate models.”

I’m not saying this is ideal, or even a defense of climate model projections. Climate models should ideally produce results based entirely upon physical first principles. For the same forcing scenario (e.g., a doubling of atmospheric CO2), twenty different models should all produce about the same amount of future surface warming. They don’t.

Instead, after 30 years and billions of dollars of research, they still produce anywhere from 1.5 to 4.5 deg. C of warming in response to a doubling of atmospheric CO2.

The Big Question

The big question is, “How much will the climate system warm in response to increasing CO2?” The answer depends not so much upon uncertainties in the component energy fluxes in the climate system, as Frank claims, but upon how those energy fluxes change as the temperature changes.

And that’s what determines “climate sensitivity”.

This is why people like myself and Lindzen emphasize so-called “feedbacks” (which determine climate sensitivity) as the main source of uncertainty in global warming projections.


247 Comments
September 13, 2019 4:06 pm

“This is why people like myself and Lindzen emphasize so-called “feedbacks” (which determine climate sensitivity) as the main source of uncertainty in global warming projections.”

Which is why it is important to properly resolve Lord Monckton’s criticism of the feedback mathematics.

Reply to  Eric Stevens
September 14, 2019 7:03 am

I apologize in advance if this seems snide; it isn’t meant to be. But I have to ask what your criterion is for deciding whether it has been properly resolved. It seems to me that the only way in which you can resolve it for yourself is actually to master that aspect of feedback theory.

Now, with the possible exception of the nonlinearity question, the feedback-theory aspect his criticism deals with is the most rudimentary aspect possible; indeed, it’s so basic that the control-systems texts I’ve seen don’t even deal with it; they’re concerned with non-equilibrium, usually vector, states, which require differential equations. His criticism, in contrast, deals only with scalar equilibrium, for which algebra is adequate.

Nonetheless, my experience is that the vast majority of readers here won’t have the patience to work through the math, so to most of those who don’t take Lord Monckton’s pronouncements as gospel it will seem unresolved.
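For readers willing to try that algebra, the scalar-equilibrium case referred to above really is one line. A minimal sketch (the numbers are illustrative only, not anyone’s published feedback values):

```python
def equilibrium_response(delta_t0, f):
    """Scalar equilibrium feedback: a no-feedback response delta_t0 is
    amplified to delta_t0 / (1 - f) by loop gain f; requires f < 1
    for a finite, stable equilibrium."""
    if not f < 1.0:
        raise ValueError("loop gain must be < 1 for a finite equilibrium")
    return delta_t0 / (1.0 - f)

# An illustrative 1.2 deg no-feedback warming with a net loop gain
# of 0.5 doubles at equilibrium:
print(equilibrium_response(1.2, 0.5))  # 2.4
```

The entire dispute over the equilibrium case is about what goes into `delta_t0` and `f`; the closed-form relation itself is just this division.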

EternalOptimist
September 13, 2019 4:07 pm

One thing is for sure

the science is settled

Clyde Spencer
Reply to  EternalOptimist
September 13, 2019 5:26 pm

EternalOptimist
That is an unsettling conclusion!

Michael Jankowski
September 13, 2019 5:28 pm

I like how the water in the pots has a known and uniform temperature.

Scissor
Reply to  Michael Jankowski
September 13, 2019 6:58 pm

Yes, even in something as small as a pot there are temperature gradients, hot spots, etc. Measuring the average temperature of the water in a real pot is not trivial.

Toto
September 13, 2019 5:31 pm

I want one of those pots with a lid whose coverage is a function of the water temperature. I’m tired of lid off where it takes forever to boil and lid on where it boils over. We need the Iris Lid. I hereby give the patent to public domain for the benefit of humanity.

Barbara
Reply to  Toto
September 13, 2019 7:54 pm

Just FYI, Toto, they do have those: pots with lids that have what is essentially an iris that allows you to adjust the size of the opening.

Toto
Reply to  Barbara
September 13, 2019 9:53 pm

Ah, but is it automatic? Self-regulating like the earth model of the pot?

Dnaxy
September 13, 2019 5:36 pm

How can we get more warming of the surface—on a surface that is 70% water—if we do not have more heat of condensation popping out somewhere in the overlying air column?

Alan McIntire
September 13, 2019 6:00 pm

Another item about pots boiling: at 1 atmosphere of pressure, the water is going to boil at 100° C, and as long as there’s water left, the temperature is not going to change no matter how much you turn up the burner.

I suspect that convection, conduction, and cloud formation work on Earth’s temperature the same way. A lot of added energy will go into latent heat.

September 13, 2019 6:11 pm

For all their complexity, Frank has reduced climate models to a simple linear extrapolation of radiative forcing. That does not surprise me because that is what they have been tuned to achieve. He then looks under the hood of climate models and finds huge variation between one of the output variables, cloud fraction, and what has been measured. The error is so large it swamps any possible variation caused by CO2.

If climate models had predictive merit they would not need to be tuned. They would apply the fundamental physics to give accurate results for any climate variable in any location on Earth.

The fact that models are tuned without any CO2 forcing to maintain a constant temperature over time does not mean they do not have large errors. It simply means the large errors cancel each other out through the tuning process that targets no temperature change.

My simple model of Earth’s climate system has greater predictive ability than highly complex ocean-atmosphere coupled models. Water vapour increases rapidly over a water surface above 20C, thereby limiting tropical ocean surface temperature to around 30C; sea ice forms at -2C, limiting heat loss from polar oceans; water vapour condenses as air moves from tropics to poles, creating reflective clouds that limit heat input to the oceans. The area-averaged global sea surface temperature is 16C — exactly the area average of 30C at the tropics and -2C at the poles. The model remains accurate provided there is sea ice at both poles.

The fundamental error in climate models is the attention given to the atmosphere. This stems from the model background in weather forecasting. The energy in the climate system is in the oceans. That energy drives weather.

u.k.(us)
September 13, 2019 6:15 pm

That is all fine and good, but until someone can explain why “a watched pot never boils”, I think I’ll just stick to using the microwave, radioactivity be damned.

September 13, 2019 6:16 pm

It’s all very interesting, but so what? It reminds me of my earlier days as a police prosecutor dealing with a rape case.

As rape seldom occurs in front of witnesses, unless there is physical injury to the victim it comes down to a case of he said, she said.

The “Warmer” lobby are free to arrange their evidence to suit what they want it to show, so the defence side, the sceptics, have to counter that.

In the court of public opinion results count, so we have to play the same game as the Warmers: use the same basic IPCC data as the warmers use, but with items such as clouds, which I understand they do not like to use, to hopefully come up with totally different results.

Then you can challenge the warmers as to how your models show, for example, cooling while theirs show warming. This would be far better than lots of words, such as today’s article, which most people, including myself, have difficulty understanding.

MJE VK5ELL

Mark Broderick
Reply to  Michael
September 13, 2019 6:36 pm

Great idea, if we had the funding !

TRM
Reply to  Michael
September 13, 2019 6:41 pm

Here are a couple to start with:

2014 – Dr Easterbrook predictions (Oceanic cycles):
http://cdn.cnsnews.com/documents/EasterbrookL%20coming-century-predictions.pdf
Basically cold (1960s-1970s) or very cold (1880s-1915) or OMFG cold (Dalton min) for the next 20 years.

2012-2014 David Dilley Predictions (Orbital/Gravitational cycles):
https://www.youtube.com/watch?v=2WaU_NJfKOE
2019-20 = rapid cooling (1880s like)
2022-2040 = Coldest phase

Richie
Reply to  TRM
September 16, 2019 4:23 am

Hence the frenzied drive to persuade the public to accept carbon taxes “before it’s too late,” i.e., 2030, when it will be too cold to sell carbon taxes to the shivering and bewildered masses.

Steven Mosher
September 13, 2019 6:44 pm

“The big question is, “How much will the climate system warm in response to increasing CO2?” The answer depends not so much upon uncertainties in the component energy fluxes in the climate system, as Frank claims, but upon how those energy fluxes change as the temperature changes.

And that’s what determines “climate sensitivity”.

This is why people like myself and Lindzen emphasize so-called “feedbacks” (which determine climate sensitivity) as the main source of uncertainty in global warming projections.”

Since I first entered this debate it was clear that this is the only point worth debating.

Instead, skeptics waste all their time and energy, undermining their own credibility, by insisting on silly arguments that are dead wrong. Like Pat’s. Like Heller’s. Like Salby’s. Like the Sky Dragons’. These folks earned the “D” word appellation; problem is, y’all get tarred with it by refusing to disown their crap.

1. It is getting warmer; there was an LIA. The temperature record is not fake. Sorry Heller.

2. GHGs warm the planet, they do not cool the planet. Sorry Ned.

3. CO2 is a GHG. It’s a powerful trace gas that helps plants grow, and warms the planet.

4. Humans are responsible for the increase in CO2. Sorry Salby.

There is no point in debating these. Oh yes yes, because all knowledge is uncertain you can try to debate them. But you will fail.

Which leaves only three open and interesting questions:

A) How much CO2 will we emit? Is RCP 8.5 crazy? (yup)
B) How much will doubling CO2 warm the planet? Is Nic Lewis right? (good debate)
C) How do we balance our investments in
1. Mitigation (reducing GHGs)
2. Adaptation (preparing for the weather of the PAST)
3. Innovation

Note: “C” isn’t a science debate.

Now I want you all to notice something. Nowhere in #1-4 are models even mentioned. We know it’s getting warmer by looking at the record. No GCMs required. Plants know it’s getting warmer. Insects know this. Migrating animals know this. The damn dumb ice knows this. It is silly to deny it. It is silly to argue (as some do) that the record is a hoax. A few years ago the GWPF started its own audit of the temperature record with a panel of great skeptical statisticians. They stopped. They quit.

Next we know GHGs (water, CO2, CH4) warm the planet; they do not cool it. We even know how they warm the planet: they reduce the rate of cooling to space. We’ve known this for over 100 years. No GCMs required.

Next we know CO2 is a GHG: basic lab work. No GCMs required.

And last we know that our emissions are responsible for the rise in CO2. No GCMs required.

None of the core science, core physics even requires a GCM. Even if every GCM is fatally flawed we will still know: we put the CO2 there; CO2 causes warming.

Reply to  Steven Mosher
September 13, 2019 8:08 pm

History also shows us that the contribution of CO2, at rates several times larger than today, to warmer temperatures is so trivial that it is overwhelmed by other natural climate forcings. This makes items 1, 2, and 3 only of academic interest, and completely unimportant.

Reply to  jtom
September 13, 2019 9:09 pm

Agree completely. Theoretical hogwash is what this is. Astronomical and geological influences aside, the Earth has a self-regulating climate — hence life. CO2, very obviously, has little to nothing to do with the climate.

Loydo
Reply to  jtom
September 14, 2019 1:03 am

” contribution of CO2, at rates several times larger than today”

Today’s rate is almost 3ppm/year. When exactly was it “several times larger”?

Reply to  Steven Mosher
September 13, 2019 10:43 pm

First of all, Mr. Mosher, you paint all skeptics with the same brush. That’s bullsh*t and you’ve been around on this forum long enough to know it.

Second of all, the debate for the average skeptic has never been if CO2 causes warming. The debate is the ramifications of continuing fossil fuel use versus the ramifications of not continuing fossil fuel use. The first is unknown and highly debatable. The second can be pretty much predicted with a great deal of certainty, which is death in the billions.

Third, almost no one is actually in the debate. Russia not in. China not in. Africa not in. In fact almost the whole world is making noises but doing nothing but increasing fossil fuel use.

So stop painting those of us who actually have some perspective as if we’re all a bunch of idiot hill billies who never went to school and wouldn’t have graduated if we did. You’re no better than Nick Stokes. You have plenty of knowledge to bring to the discussion but instead you insist on p*ssing on everyone like you’re the only one who knows anything about anything.

Loydo
Reply to  davidmhoffer
September 14, 2019 1:28 am

“you paint all skeptics with the same brush”

No he doesn’t, he clearly delineates.

“the debate for the average skeptic has never been if CO2 causes warming”

Unfortunately this is not true. It might be for you and some others, but for many — like Mike below (“…you cannot use the CO2 hypothesis…”) — the debate is exactly that. They are the D team that Steven rightly identifies.

Reply to  Loydo
September 14, 2019 1:19 pm

bullsh*t.

He says, and I quote:

problem is, y’all get tarred with it by refusing to disown their crap.

I’ve personally disowned all but one of the sources that Mosher listed, and I’ve never looked at Heller’s work so I do not have an opinion. The Sky Dragons have been banned from this site. Ned was totally demolished on this site. Salby has been widely criticized on this site. Yet Mosher complains that “y’all get tarred with it by refusing to disown their crap”. Seems to me they’ve been pretty much as disowned as you can get. They’re endorsed by a few repeat commenters who have no influence and don’t represent the majority of skeptics, but Mosher paints them as if they do.

Reply to  Steven Mosher
September 13, 2019 11:44 pm

Mosher……
3. ”CO2 is a GHG. It’s a powerful trace gas that helps plants grow, and warms the planet”

So what cooled the planet from the ’40s to the ’70s? Did someone remove the CO2 that caused the warming back then? If you add it and it warms, then you must remove it to cool, right? That’s your argument, not mine. Or doesn’t it work in reverse?
If you cannot answer that, you cannot use the CO2 hypothesis with any accuracy. Sorry Mosher.

Loydo
Reply to  Steven Mosher
September 14, 2019 1:13 am

+++

You’ll get loads of disagreement from the D team, Steven, but you won’t find Spencer’s, Curry’s, Christy’s, Lindzen’s or even Watts’ names among them. Funny about that.

Reply to  Steven Mosher
September 14, 2019 1:55 am

Mosh, we know a little bit more: a doubling of the CO2 content of the atmosphere generates an additional forcing of 3.8 W/m² at the TOA. No model needed. The question is: how much warming will that additional energy generate? Nic Lewis used the best available observational data and found (with an EBM): 1.3 °C on short timescales (TCR) and 1.8 °C on longer ones, reaching equilibrium in the oceans (ECS). The GCM mean gives higher values. Why? This is the main remaining question. A hot lead: patterns. The GCM projections do not reflect the patterns of observed warming in the SST. In the observations there are pronounced differences, e.g. in the tropical east Pacific; not so in the GCMs. Those differences bring the observation-based sensitivity down. One possibility: the observed patterns are random, some kind of internal variability. In the long run the models would be right in this case. The other possibility: the “patterns” are not a product of chance; they are a product of the forcing itself. This would point to GCM deficits, and the (on the whole, cooling) patterns — La Niña-like patterns in the east Pacific — would persist with ongoing forcing. Two new papers bolster this solution: https://www.researchgate.net/publication/335650196_Indian_Ocean_Warming_Trend_Reduces_Pacific_Warming_Response_to_Anthropogenic_Greenhouse_Gases_An_Interbasin_Thermostat_Mechanism/link/5d7931ce4585151ee4af4295/download ; https://www.researchgate.net/publication/334012764_Strengthening_tropical_Pacific_zonal_sea_surface_temperature_gradient_consistent_with_rising_greenhouse_gases . This would point to Nic Lewis’s sensitivity estimates remaining reliable in the future as well, and some work to do for modelers. The question is not whether GHGs warm the planet; the question in science is: how much!
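The energy-budget calculation this comment refers to fits in a few lines. A sketch with round, made-up observational inputs — only the 3.8 W/m² forcing per doubling is taken from the comment; the change values below are invented for illustration and are not Lewis’s actual published numbers:

```python
F_2X = 3.8  # W/m^2 per CO2 doubling (figure used in the comment above)

def tcr(d_temp, d_forcing):
    """Transient response: observed warming scaled to one doubling's forcing."""
    return F_2X * d_temp / d_forcing

def ecs(d_temp, d_forcing, d_uptake):
    """Equilibrium response: subtract the ocean heat uptake still 'in the
    pipeline' from the forcing change before scaling."""
    return F_2X * d_temp / (d_forcing - d_uptake)

# Illustrative changes between two base periods (made-up round numbers):
d_temp, d_forcing, d_uptake = 0.8, 2.3, 0.6  # K, W/m^2, W/m^2
print(round(tcr(d_temp, d_forcing), 2))            # ~1.3 K
print(round(ecs(d_temp, d_forcing, d_uptake), 2))  # ~1.8 K
```

Because the ocean-uptake term shrinks the denominator, ECS always comes out larger than TCR for the same inputs; the debate is over which observational values to feed in.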

Ethan Brand
Reply to  Steven Mosher
September 14, 2019 5:59 am

The discussion catalyzed by Frank is whether current GCMs can have useful predictive skill, nothing more, nothing less. How are your points relevant to that?
Regards,
Ethan Brand

Reply to  Steven Mosher
September 14, 2019 7:02 am

“The big question is, “How much will the climate system warm in response to increasing CO2?” The answer depends not so much upon uncertainties in the component energy fluxes in the climate system, as Frank claims, but upon how those energy fluxes change as the temperature changes.

And that’s what determines “climate sensitivity”.

Is it just me, or does this statement strike anybody else as ridiculous, because it assumes the very thing that is being questioned?

How can it not depend upon uncertainties in the component energy fluxes and yet depend on changes in those fluxes, if the uncertainties in question attach to those very fluxes? The statement simply denies the importance of dealing with those uncertainties properly, in order to use those uncertain fluxes to make the calculations anyway and assert them as reliable.

Reply to  Robert Kernodle
September 14, 2019 8:24 am

Steven M has outdone himself in comment … https://wattsupwiththat.com/2019/09/13/a-stove-top-analogy-to-climate-models/#comment-2794451.

Somebody with more experience than moi, please dissect it, or I’ll have to give it a go.

Reply to  Robert Kernodle
September 14, 2019 2:49 pm

Okay, I guess I’ll have a go at it:

“The big question is, “How much will the climate system warm in response to increasing CO2?” The answer depends not so much upon uncertainties in the component energy fluxes in the climate system, as Frank claims, but upon how those energy fluxes change as the temperature changes.
And that’s what determines “climate sensitivity”.

I spoke on this in my earlier comment … https://wattsupwiththat.com/2019/09/13/a-stove-top-analogy-to-climate-models/#comment-2794834

This is why people like myself and Lindzen emphasize so-called “feedbacks” (which determine climate sensitivity) as the main source of uncertainty in global warming projections.”

Feedbacks involve clouds, whose uncertainty is the issue in question. So you put aside a proper treatment of uncertainties here, proceed to use simulations based on those uncertainties as justification for ignoring the treatment of uncertainty in these feedbacks, and then proceed to talk about feedbacks as the main source of uncertainty — when feedbacks INCLUDE cloud uncertainties (the thing in question) — as if uncertainties in clouds were not part of the uncertainty in the feedbacks you wish to focus on. This is a circular refusal to focus on the issue. You can’t just say that uncertainties in component energy fluxes are the wrong focus while, at the same time, claiming vindication for the very energy fluxes in clouds whose uncertainty is the very issue.

Since I first entered this debate it was clear that this is the only point worth debating.

There is no debate in a foregone conclusion that simply dismisses the other side of the debate, in which you subsume as proven in your assertion something that has yet to be proven.

Instead, skeptics waste all their time and energy, undermining their own credibility, by insisting on silly arguments that are dead wrong. Like Pat’s. Like Heller’s. Like Salby’s. Like the Sky Dragons’. These folks earned the “D” word appellation; problem is, y’all get tarred with it by refusing to disown their crap.

… empty condemnation, per my previous statements.

1. It is getting warmer, there was an LIA. The temperature record is not fake. Sorry Heller.

Depends on what you mean by “getting warmer” — and so what, if it’s only by a minimal amount? Small concession that you acknowledge an LIA (“Little Ice Age”). Stating that the temperature record is not fake does not make it so.

2. GHGs warm the planet, they do not cool the planet. Sorry Ned.

Merely stating this, while dismissing arguments to the contrary, does not make it so.

3. CO2 is a GHG. It’s a powerful trace gas that helps plants grow, and warms the planet.

GHG (“Green House Gas”) has always been an absurd label, since there were never any gases that act like a greenhouse, and to continue using the name after this fact became clear is a plain signal that the idea of a roof trapping heat is the idea being pushed by the mere utterance of this mistaken, long-outdated label. CO2 is an infrared-active gas (an IAG, if you will). The GHG-label encourages and enables the continued confusion of convection with radiation.

4. Humans are responsible for the increase in CO2. Sorry Salby.

Merely stating this, while ignoring strong arguments to the contrary does not make it so. Sorry, Steven.

There is no point in debating these. Oh yes yes, because all knowledge is uncertain you can try to debate them. But you will fail.

… empty, over-bloated confidence.

Which leaves only Three Open and interesting questions:

… only three that YOU want to consider. There are much larger and even more interesting questions, which I’ll get to in a sec.

A) How much CO2 will we emit? Is RCP 8.5 crazy? (yup)

Better Question: Does it matter how much CO2 we emit? — RCP 8.5 science fiction? (yup)

B) How much will doubling CO2 warm the planet? Is Nic Lewis right? (good debate)

Better Question: What does it matter how much a doubling of CO2 will warm the planet? Is Nic Lewis even a relevant reference here, since the very fact that CO2 warms the planet at all can still be effectively argued? (better debate)

C) How do we balance our investments in
1. Mitigation (reducing GHGs)
2. Adaptation (preparing for the weather of the PAST)
3. Innovation
Note: “C” isn’t a science debate.

Okay, a good question, as long as you remove #1.

Now I want you all to notice something. Nowhere in #1-4 are models even mentioned. We know it’s getting warmer by looking at the record. No GCMs required. Plants know it’s getting warmer. Insects know this. Migrating animals know this. The damn dumb ice knows this. It is silly to deny it. It is silly to argue (as some do) that the record is a hoax. A few years ago the GWPF started its own audit of the temperature record with a panel of great skeptical statisticians. They stopped. They quit.

All those statements are married to models in one way or another, whether the actual word, “model” appears in them or not.

(1) I am breathing, (2) eating regular meals, (3) walking upright consistently, (4) observing the passage of day into night. I want you to notice something: nowhere in #1-4 is living (being alive) even mentioned. Yet the actions I describe are married to the concept of living (being alive). Word-elimination games get us nowhere, especially when those words can be read between the lines.

Condemning argumentation of strongly unsettled points smacks of desperation to silence the other side.

Why was the GWPF audit stopped? Who stopped it? Was it merely stopped, impeded, starved of funding, or shut down by refusal to cooperate with the investigators? I need details to know what bearing this has on the actual state of the temperature records.

Next we know GHGs (water, CO2, CH4) warm the planet; they do not cool it. We even know how they warm the planet: they reduce the rate of cooling to space. We’ve known this for over 100 years. No GCMs required.

Sorry, but all crap, according to sound arguments to the contrary.

Next we know CO2 is a GHG: basic lab work. No GCMs required.

What I know is that “GHG” is a poor choice for a name: basic communication clarity. And GCMs have the still-questionable, badly named assumption programmed into their workings.

And last we know that our emissions are responsible for the rise in CO2. No GCMs required.

Sorry, crap again.

None of the core science, core physics even requires a GCM. Even if every GCM is fatally flawed we will still know: we put the CO2 there; CO2 causes warming.

… empty preaching, per previous statements, and per numerous sound counterarguments to the contrary. The core physics and core science you refer to give physics and science a bad name. “Abhorrent physics” and “abhorrent science” might be better labels.

Jordan
Reply to  Robert Kernodle
September 14, 2019 6:49 pm

Good comment Robert.

The GCMs and the (cough) “core physics” require a tropospheric hotspot to provide a thermodynamic source for the assumed extra downwelling IR due to increasing pCO2.

But this is not observed. Somewhere out of this train wreck, Mosher still seems to “know” that increasing pCO2 causes surface warming.

Tom Abbott
Reply to  Robert Kernodle
September 15, 2019 6:52 pm

“Somewhere out of this train wreck, Mosher still seems to “know” that increasing pCO2 causes surface warming.”

Every alarmist has the same problem Mosher has: They assume things not in evidence and then extrapolate from there. Not scientific at all. It’s more like wishful thinking.

TRM
Reply to  Robert Kernodle
September 14, 2019 8:52 am

Exactly. You can have your cake and eat it too!

Reply to  Robert Kernodle
September 14, 2019 10:50 am

NS –> Game, set, match

Reply to  Steven Mosher
September 14, 2019 8:03 am

“We know it’s getting warmer by looking at the record. ”

Actually, we do *NOT* know that it is getting warmer — not if you mean maximum temperatures are increasing. We know the average temperature is increasing, but that can also be because minimum temperatures are going up. In fact, maximum temperatures could be decreasing if minimum temperatures are increasing more; the average would still be going up! The second you begin using averages you lose sight of what is actually happening; the true data becomes masked.

If you look at the number of cooling-degree days at various sites around the globe it becomes pretty obvious that maximum temperatures are not going up. If they were going up the number of cooling-degree days would be going up also, but they are not, at least not over the past three years.

So what would cause nighttime (i.e. minimum) temperatures to go up but not maximum daytime temperatures? CO2? I highly doubt that. It would have the same impact on IR radiation whether it is day or night.
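The arithmetic behind this point is easy to demonstrate with invented numbers (these are not station data, just a sketch): a rising minimum can pull the daily-midpoint average up even while the maximum falls.

```python
# Invented (min, max) daily temperatures for two periods, deg F:
early = [(50, 90), (52, 91), (51, 89)]
late  = [(58, 88), (59, 89), (60, 87)]  # minima up ~8 F, maxima down ~2 F

def period_mean(days):
    """Average of the daily (min+max)/2 midpoints -- the usual 'average'."""
    return sum((lo + hi) / 2 for lo, hi in days) / len(days)

print(period_mean(early))          # 70.5
print(period_mean(late))           # 73.5 -- the average went up...
print(max(hi for _, hi in early))  # 91
print(max(hi for _, hi in late))   # 89  -- ...while the maximum fell
```

So a rising mean by itself does not tell you which tail of the daily cycle is moving; that requires looking at minima and maxima separately.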

John Q Public
Reply to  Steven Mosher
September 14, 2019 10:18 am

Models are the only way to attempt to predict the future. Saying CO2 is a greenhouse gas (by the way, what do greenhouses have to do with CO2? They operate by blocking convection) and will cause warming means nothing. How much warming? Is it significant? Is it beneficial? What are the natural cycles? Etc.

All arguments and policy decisions come down to the models, and that is a reality. Without some predictive capability it is all hand waving.

Ktm
Reply to  Steven Mosher
September 14, 2019 5:26 pm

Mosher, what appellation should be applied to the Warmist cranks, so that both sides can distance themselves?

You know, like those who say sea level rise is an imminent threat (it’s not).

Those who say climate change is responsible for every newsworthy weather event? (It’s not)

Or that Greenland or Antarctica are melting down (they’re not).

Surely you are equally annoyed with the cranks on your side, who distract from the interesting bits.

Tom Abbott
Reply to  Steven Mosher
September 15, 2019 6:29 pm

Mosher wrote: “1. It is getting warmer, there was an LIA. The temperature record is not fake. Sorry Heller”

That’s “Hockey Stick” thinking. I can imagine Steven visualizing his favorite Hockey Stick as he recites the above.

The truth is it has only been warming since the 1970’s, and has only warmed today to levels that are equal to or less than the temperature levels in the 1930’s. All the regional, unmodified temperature charts from around the world say so.

So it is really not warming in the sense that Mosher implies which is that it is warming more than is normal for the Earth, when in fact the warming today is no warmer than the Earth was in the recent past. The regional chart for the US actually shows we have been in a temperature decline since 1934. The US is currently about 1C cooler now than in 1934. That’s not what I would describe as “warming”.

The only thing that shows warming is Steven’s bogus, bastardized Hockey Stick chart, which resembles no other unmodified regional chart on Earth. Sorry Mosher.

RockyRoad
September 13, 2019 7:22 pm

Your pot model is wrong!

The lid should be UNDER the pot!
Or if you prefer to keep the lid on top, the heat source should come from the TOP, not the bottom!

Then you would have a more accurate model!

September 13, 2019 8:53 pm

The problem here is that the only “experiment” is the actual one running on Planet Earth v1.0.

Everything else is an in silico animation that can do anything the programmers want it to do.

Leap Tall Buildings in a single bound? … yep.
Stop speeding locomotives with an outstretched hand? … yep.
Fly through the air with the power of thought? … yep.

In short: junk.

Pat Frank just showed the in silico output from dozens of GCMs has no clothes. The Emperor is naked.

It may not be perfect; yes, Dr Spencer, some models do close the energy balance at the TOA (at least to within what we know), and others do not. But the fact that the internals of a complex system simulation are all wrong means we have no idea where the energy went in these simulations.

My suspicion is there is a lot more heat lost in the polar radiative losses than the models account for. Probably a problem of the models' implementation of the Earth's surface versus the real geodesy of the Earth.

Simply put, I suspect the polar TOAs are not being given enough radiative control (energy loss) in the models, which is why, when modellers balance the TOA, their models run too hot. They got the geodesy, and thus the heat transport, wrong. An inspection of Pat Frank's Figure 4 strongly argues that point. The modellers do not know where the energy is going in their models. Thus everything becomes increasingly junk as they time step.

EdA the New Yorker
September 13, 2019 9:12 pm

Dr. Spencer,

Two issues that haven’t been mentioned in detail, but may play a role here:

1) A previous post on WUWT demonstrated the rate of temperature increase over a 30-year span from 1910 to 1944 was nearly identical to that from 1975 to 2009, based on the HADCRUT4 set. (If someone can remember the post, please give the attribution.) IF, as you frequently contend, the atmosphere is in equilibrium, then the temperature excursion in the first half of the century must have arisen from long-term statistical fluctuations in the natural forcing. The excursion in the second half is commonly attributed to carbon dioxide forcing, along with its positive feedbacks. Your Figure 1 from your previous post shows the forcing during a 100-year annealing period on an unfortunately large scale, but from what I can make out, the longest excursion is 10 years. Converting the forcing magnitude to temperature variation is somewhat difficult, but if you have that handy for this case, I’d like to see it. This suggests to me that a zero energy gain during the annealing period is not a sufficient condition to assume a proper background to separate that effect from the carbon dioxide perturbation during the run.

2) The primary effect of OCO appears to be modeled strictly as radiative energy transfer. The collisional transfer to nitrogen, oxygen, and water vapor is going to carry a lot of that energy through the atmosphere, and will be a fairly strong function of pressure altitude and temperature. It appears to me that these effects would fall into the category of what Pat Frank asserts, namely that excluded computations can lead to a directional bias in the resulting temperature trajectories, and any distinction between natural and anthropogenic contributions is not possible.

I appreciate the work that both of you have put into this discussion.

paul rossiter
September 13, 2019 9:25 pm

All this discussion reminds me of a similar situation I was involved in some years ago. I had a theory about how the electrical resistivity of an alloy changed as precipitation of a second phase occurred, based upon the electron mean free path. Another group of scientists had an alternative theory based upon the anisotropy of conduction electron scattering. We went at each other tooth and nail in the scientific journals arguing our cases, each convinced that “our” theory best explained the observed phenomenon. We finally realized that application of either theory first made the effect of the other seem small and that in fact, both were necessary to fully explain observation.

It seems to me that the same might apply in this discussion. The various protagonists argue the merits of their own cases, whereas it is likely that all reasonable cases contribute to the whole picture of why climate models are not suited to the purpose of determining the effect of CO2 on global temperature.

I suggest this for the following reasons:

GCM’s are not based solely upon first principle physics and so have to be “tuned” so that they produce a stable result over time before the CO2 button is pressed. This implicitly assumes that either the tuning is valid over all states of the system or the system state remains unchanged when CO2 is introduced. Neither has been verified, and Frank’s analysis shows how uncertainty is introduced and propagated if just one factor (LWCF) is allowed to drift over possible values. Mosher and Lindzen correctly argue that uncertainties in the feedback processes similarly throw GCM predictions into serious doubt. Spencer correctly argues that the lack of inclusion of specific factors like water vapour feedback renders the GCM’s worthless. There is also the fact that none of the GCM’s can account for the long term historical (natural?) record, and none of them properly incorporate the effects of external drivers like the orbital effects, the influence of cosmic rays on cloud formation, and so on.

Similarly, the physical data record has some serious issues to resolve. Changes in atmospheric CO2 seem to lag behind temperature changes on most if not all time scales, violating the notion of CO2 as a major causal factor; surely this is a most critical piece of evidence requiring validation. The land surface temperature record has serious deficiencies: why do only the night time temperatures around cities show significant increase over recent times (obviously UHI), and yet these have a strong influence in the homogenisation process? This leads to the suspicion that any changes in the “global temperature average” (surface or near surface) may be reflecting increased urbanisation rather than any direct CO2 effect. Similarly, how much of the satellite record might be due mainly to urbanisation, “natural” causes, or CO2 effects? All of this is compounded by cherry-picking data from the historical record to push particular agendas.

My apologies for any errors in reducing the well constructed arguments of the various proponents to just a few words. All I am trying to suggest is that maybe we would all be better served if we took a more holistic approach to the issue of climate change: all the reasons why GCM’s fail in accounting for any influence of CO2; what are the critical physical data required to invalidate a hypothesis, and how well are they established? This does not mean that new ideas should not be tested rigorously; of course they should, and the current discussion of Frank’s proposal is a good example of such testing. What I am suggesting is that if they are found to have merit, further arguing for one over another is just not necessary; it is probably actually counterproductive.

I am of course assuming that all concerned are driven by scientific altruism and not personal or political agendas!

Anton Eagle
September 13, 2019 10:58 pm

One aspect of this debate seems to be going unmentioned. Folks keep discussing the GCM’s as if they are experiments… and produce results from measurements of some real process.

But, they are NOT experiments. They are programs, and only produce the output they are programmed to produce. The fact that 20 models produce similar results doesn’t prove a darn thing… because they are programmed to produce that result… they can’t produce any other!

In fact, if you wanted to, you could write another program that could examine the inputs and code for the computer models, and predict the outputs… WITHOUT EVER RUNNING THE MODELS. Computer models are NOT experiments.

And, although Dr. Frank seems correct in asserting that error propagation results in significant uncertainty… it doesn’t matter. The problem with the GCMs is not the uncertainty or error propagation… it is the models themselves, and the fact that computer models don’t prove anything as they just regurgitate the result you tell them to.

Reply to  Anton Eagle
September 14, 2019 3:10 am

“Computer models are NOT experiments.”

I know very little physics. But this is so basic that anybody should be able to understand it.

For a model to be skillful, it must meet two criteria, at the least:

1. It must be derived from nature

2. It must be successfully tested against nature

That’s how we have high certainty of F=ma, but very low certainty of pretty much any computer model.

John Q Public
Reply to  Karim Ghantous
September 14, 2019 10:22 am

Pat Frank effectively performed “2”, and the models failed.

Antero Ollila
September 13, 2019 11:15 pm

After reading the comments I feel that the majority of people have not understood the basic feature of the GCMs. A direct quote: “This is why climate models can have uncertain energy fluxes, with substantial known (or even unknown) errors in their energy flux components, and still be run with increasing CO2 to produce warming, even though that CO2 effect might be small compared to the errors. The errors have been adjusted so they sum to zero in the long-term average.”

I would say this even more simply: Climate models do not adjust or change cloud effects on an annual basis. They keep the cloud forcing effect constant during the coming years. Therefore there is no propagation error of cloud effects in the model runs. Dr. Spencer says this in another way that the sum of the long-term average is zero. For me, it is the same thing.

It looks like nobody who really knows the properties of GCMs has ever commented on this issue.

Reply to  Antero Ollila
September 14, 2019 8:15 am

If the cloud forcing effect is highly uncertain itself then it contributes to the uncertainty of the model output. That uncertainty adds with each successive iteration of the model. Merely holding the cloud forcing effect constant doesn’t change this one iota.
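For readers who want the arithmetic: if a per-step calibration uncertainty u is treated as independent at each iteration, it compounds in quadrature, giving u·√n after n steps. A minimal sketch (the ±4 W/m² figure is used here only as an example value, in the spirit of Frank's LWCF number):

```python
import math

def propagated_uncertainty(per_step_u, n_steps):
    """Root-sum-square propagation of a constant per-step uncertainty
    through n independent iterations: u_total = u * sqrt(n)."""
    return per_step_u * math.sqrt(n_steps)

# A hypothetical +/- 4 W/m^2 per-step uncertainty grows, not shrinks:
for n in (1, 25, 100):
    print(n, propagated_uncertainty(4.0, n))  # 4.0, 20.0, 40.0
```

Holding the input constant fixes its value, not its uncertainty; the √n growth is unaffected.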

John Q Public
Reply to  Tim Gorman
September 14, 2019 10:24 am

In fact it is probably the very reason the models fail. If there were a compensatory mechanism, it would have dampened the perturbation (at least in the actual GCMs as opposed to the emulator).

TRM
Reply to  Antero Ollila
September 14, 2019 10:57 am

I have a problem with this part: “Climate models do not adjust or change cloud effects on an annual basis. They keep the cloud forcing effect constant during the coming years”

Not with the accuracy of the statement about how GCMs deal with clouds, but with:
A) The assumption that in reality cloud cover is stable. What length of satellite images/measurements do we have of global cloud cover?
B) Even if stable, what about placement of cloud cover? A cloud in the tropics prevents ocean warming more than a cloud in the far north/south does, due to the angle of solar energy.

Paramenter
September 14, 2019 3:00 am

I reckon Dr Spencer’s post actually confirms the main tenets of Dr Frank’s article.

with all of its uncertain physical approximations or ad-hoc energy flux corrections

In other words: if models were not subjected to constant ad-hoc adjustments and tinkering, they would run wild, possibly in the region of ±20 C over the next decades. Yet the moment a model wants to go haywire, a modeller appears to the rescue, like a deus ex machina, and by ad-hoc adjustments and corrections forces the model predictions to look more sensible.

This is why climate models can have uncertain energy fluxes, with substantial known (or even unknown) errors in their energy flux components, and still be run with increasing CO2 to produce warming, even though that CO2 effect might be small compared to the errors. The errors have been adjusted so they sum to zero in the long-term average.

That’s unclear to me. If errors were adjusted, that means they are not errors anymore. If we can nicely balance out all the energy terms in our model, that means there are no unknown errors anymore, at least in our mathematical model: all variables representing energy fluxes balance out precisely.

Antero Ollila
Reply to  Paramenter
September 14, 2019 8:38 am

That is what I expected: someone thinks that climate modelers adjust a proper correction term on a yearly basis to get the wanted temperature output. It is not so. That kind of model would have no meaning, and GCMs do not behave like that.

It seems to be too difficult to accept this basic simple property of GCMs: cloud forcing is not variable in those models but is a constant effect. The IPCC puts it this way: cloud feedback has not been applied. It means that cloud forcing is not changing according to temperature variations or according to GH gas concentrations.

Reply to  Antero Ollila
September 14, 2019 2:22 pm

“It seems to be too difficult to accept this basic simple property of GCMs: cloud forcing is not variable in those models but is a constant effect. ”

Except clouds are *not* a constant effect. Cloud cover is highly variable from day-to-day, month-to-month, and year-to-year. If cloud cover is highly variable then their “effect” is highly variable also. And since that variability has a direct impact on the entire thermodynamic system we know as Earth then any model not accounting for that variability has to have a highly uncertain result.

TRM
Reply to  Tim Gorman
September 14, 2019 2:57 pm

Exactly. Above I had written about the placement of clouds having an effect, as well as the daily, monthly, yearly possibilities you correctly cover:

“Even if stable what about placement of cloud cover? A cloud in the tropics prevents the ocean warming greater than a cloud in far the north/south due to angle of solar energy”

Jordan
Reply to  Paramenter
September 14, 2019 10:05 am

The errors have been adjusted to make them look like they are zero. The uncertainties cannot be so adjusted, and remain undiminished (but unnoticed, it would appear).

The uncertainties are there at the outset, and can never be reduced.

Some hope for uncertainty cancellation, in the way a noise process might revert to the mean in aggregate. Model uncertainty doesn’t do that, especially in an iterative model.

September 14, 2019 3:43 am

Anton Eagle plus the following comment have it correct.

A suggestion: we are told by the Warmers that it is only from the late 1950s that we have a problem with CO2.

But what about the massive increase in war industry from about 1937, lots and lots of CO2? So when did it start to get warmer?

Why, it got cooler, and we were told to prepare for the Ice Age. Lots of what-to-do stuff from some of today’s warmers of course, but now it’s heat.

What to do? Let’s panic, the-end-of-the-world-is-nigh stuff.

So what about telling the Warmers to take their wonderful intelligent computers back to, say, 1937, when industry started to produce massive amounts of war material and of course lots of CO2?

Then see what the PCs come up with, especially as to why it got colder by the 1970s.

MJE VK5ELL

John Q Public
Reply to  Michael
September 14, 2019 10:26 am

“I want you to feel my fear”, Greta d’Green

Jeff Id
September 14, 2019 4:19 am

Dr. Spencer is correct.

Schrodinger's Cat
September 14, 2019 4:36 am

What happens to the pot model if we introduce another factor which can have a much greater effect than the lid configuration? Let us say that this is the calorific value of the flame (cloud coverage) and we have no means of measuring it.

We can make assumptions, we can still run our model, we can still tweak the gas flow and the lid position and we can still calculate the relative effects.

But do we have the same confidence in the results? We immediately have an attribution problem and if the calorific value happens to be dominant (but unknown) what confidence do we have in the calculated effect of the lid?

Jeremiah Puckett
September 14, 2019 4:55 am

Interesting, but “greenhouse gases” aren’t a lid, unless the lid has massive golf ball sized holes in it. The energy can still escape. The question is whether it can escape at the same rate as having no lid. They are called greenhouse gases because that’s where all this nonsense came from. Someone observed that a greenhouse is warmer and wanted to know why. The earth isn’t a greenhouse. It is an open system.

Mark Broderick
Reply to  Jeremiah Puckett
September 14, 2019 6:58 am

“unless the lid has massive golf ball sized holes in it. ”

Exactly, the “lid” should only cover 0.04% of the pot!

Mickey Reno
September 14, 2019 6:33 am

Charles, I can appreciate the dilemma you’re pointing out, here. But it’s your example of a simplified model that still has the problem. If you ASSUME that the system must balance, and then you build a model that forces it to balance, you’ve done nothing to understand the system. You’ve only employed circular logic that can assist you not at all.

It’s easy to say we might know roughly what the air temperature is above the pot of water warmed by a flame, or how the lid will affect the temperature immediately above the pot. But since the pot lid is the only thing you’re allowing to change, you’re not modeling the open system that the climate is. Clouds do not ONLY “trap” heat. They also reflect light directly back into space. To model the Earth’s climate, your example must also allow the strength of the effect of the lid to change the size of the flame. If the size of the pot’s lid reduces the amount of heat from the flame, then your output changes completely. If the size of the pot’s lid increases the amount of heat from the flame (as CAGW theory posits), then why has the Earth’s climate not gone into meltdown mode from some period of earlier warming and already boiled away the oceans? Since we don’t know with any precision how that works in the climate, we can’t model it. But given that we still have liquid water on the planet, we ought to assume negative feedback rather than the positive feedback that climate models do. None of the models is helping to answer the question of what the nature of the feedback is. And their circular reasoning will never help answer the question of feedback, because they’re, well, circular.

Your model is too simple to be analogous to the Earth’s climate, and so it is a pointless exercise.
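The feedback point in the comment above can be made concrete with a geometric-series sketch (hypothetical numbers; f is simply the fraction of each response fed back, not any model's actual parameter). A feedback fraction with magnitude below 1 converges to forcing/(1 - f); at 1 or above it runs away:

```python
def feedback_response(forcing, f, iterations=200):
    """Sum the feedback series forcing * (1 + f + f^2 + ...).
    Converges to forcing / (1 - f) when |f| < 1; diverges when f >= 1."""
    total, term = 0.0, forcing
    for _ in range(iterations):
        total += term
        term *= f
    return total

print(feedback_response(1.0, -0.5))  # ~0.667: negative feedback, damped and stable
print(feedback_response(1.0, 0.5))   # ~2.0: positive feedback, amplified but finite
print(feedback_response(1.0, 1.1))   # enormous: runaway growth
```

The commenter's argument, in these terms, is that a climate which has kept its oceans must have an f that stayed out of the runaway regime.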

Tommy
September 14, 2019 8:23 am

I’m struggling with this part of Dr. Spencer’s analysis:

“This is why climate models can have uncertain energy fluxes, with substantial known (or even unknown) errors in their energy flux components, and still be run with increasing CO2 to produce warming, even though that CO2 effect might be small compared to the errors. The errors have been adjusted so they sum to zero in the long-term average.”

In particular, the last sentence. How can you “sum errors” to zero? The errors here, as I understand it, are an expression of uncertainty about a model being an accurate expression of the real world. What trick is this, that lets you net out uncertainty by *adding* more errors into the system? The only way you can net errors out, in the model, is by tuning for a particular preconceived outcome, but that’s a world apart from actually claiming to model reality.

Dr. Spencer seems to be saying the models can be tuned and parameterized to consistently output “reasonable” (within the bounds of what we’d expect in the real world) and “precise” (consistent with itself and other models) predictions.

What Dr. Frank seems to be saying is, so what. You should have very little confidence that they’re modeling reality. Dr. Spencer even seems to concede this point, when he says,

“Climate models should ideally produce results entirely based upon physical first principles. For the same forcing scenario (e.g. a doubling of atmospheric CO2) twenty different models should all produce about the same amount of future surface warming. They don’t.”

It’s like the two professors are talking past each other: Dr. Spencer saying “it can be done” and Dr. Frank saying “yes, but it’s without meaning”.

The analogy I can think of is in financial modeling, because I’ve seen a lot of shenanigans there. Suppose you have a business and you want to forecast your profits. You hire a consultant who builds a super fancy model based on first principles you’ve established together. For example, you know your profits next year depend an awful lot on how many wingdings you sell in Kalamazoo at Christmas, among other things. And, your profits next year depend on your profits the previous year (say, because it affects how many wingdings you can make and ship to Kalamazoo).

And, your consultant builds a model whose output looks totally reasonable and even correctly hindcasts your profits. Notably, the models produce higher profits when a “consultant impact” parameter is present.

Would you trust that? Probably not. Then, Dr. Frank walks in and points out that the consultant’s historical model is wildly incorrect at predicting the number of wingdings sold at Christmas in Kalamazoo. It’s been tuned to produce the “right” final result. It’s a model of something. But it’s certainly not a model of your business.
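The tuning trap in this analogy is easy to demonstrate. In the hypothetical sketch below (every figure invented), two "models" with opposite component breakdowns both hindcast the observed totals perfectly, so agreement with the total proves nothing about the wingding component:

```python
# Hypothetical numbers only: two "models" of total profit, both tuned to
# reproduce the observed totals, but with opposite component breakdowns.

observed_totals = [100, 102, 104, 106]

# Model A says Kalamazoo wingdings are 75% of the business; Model B says 25%.
model_a = [{"kalamazoo": 0.75 * t, "elsewhere": 0.25 * t} for t in observed_totals]
model_b = [{"kalamazoo": 0.25 * t, "elsewhere": 0.75 * t} for t in observed_totals]

def totals(model):
    """Sum each year's components back into a yearly total."""
    return [sum(year.values()) for year in model]

print(totals(model_a) == observed_totals)  # True: perfect hindcast
print(totals(model_b) == observed_totals)  # True: also a perfect hindcast
# Yet the two disagree wildly about Kalamazoo, so matching the total says
# nothing about which (if either) captures the real mechanism.
```

This is the sense in which a tuned hindcast is "a model of something, but not a model of your business."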

September 14, 2019 9:41 am

Roy Spencer is correct. The models are spun up to a steady state. The errors suggested by Frank do not appear in this steady state. Think of attribution. We know the errors are there we say. Where? In there. We know we caused more than half the warming. Where? It’s in there.

You can get about everything wrong in a CMIP. But if you get CO2 right, what does it matter? The next step is to get CO2, and water vapor as a change in that GHG, right. And then the next step is to add ocean storage. Lewis and Curry tried to do this. You can still have everything else wrong as long as you get the spin-up steady state right, the CO2 and water vapor right, and the ocean storage right. That an error can be really, really big doesn’t mean it matters.

We have CMIPs. They aren’t going anywhere. We may not like their shortcomings, but it’s like not liking money: you don’t trust it, not being backed by gold.

I found a mistake once in an audit of a company. It could have caused everything to crash and to indicate the company had no money. It didn’t turn out that way. The company was fine. The thing I found could’ve caused something to happen but it didn’t.

Barbara
Reply to  Ragnaar
September 14, 2019 2:17 pm

If “you can get about everything wrong in a CMIP,” then why have a CMIP?

Reply to  Barbara
September 14, 2019 3:29 pm

For studying changes. My point was to eliminate an argument. To the extent the error is supposed to be a problem, that is cancelled, I think, because the model stays in balance or seeks it.

Reply to  Ragnaar
September 14, 2019 4:31 pm

Errors may cancel. Uncertainty does not.

Barbara
Reply to  Ragnaar
September 14, 2019 2:19 pm

Or, if “you can get about everything wrong,” then why not simplify the model down to what you think needs to be right, and proceed from there?

Ktm
Reply to  Barbara
September 14, 2019 6:01 pm

Then you end up with a simple linear model, where people like Dr. Frank start asking standard questions using standard procedures about suitability and assumptions and uncertainties.

All the extra unnecessary knobs provide a handy excuse of complexity.

Barbara
Reply to  Ktm
September 14, 2019 6:34 pm

Ktm – exactly what I thought.

Jackie Treehorn
September 14, 2019 9:42 am

Because the models don’t capture the physics, they are “tuned” to what they’re supposed to predict. Just ask the IPCC on this. They admit it themselves: the models are formulated with data containing correlates of the physical effects they are purportedly trying to predict. In Data Science, and Mathematical Modeling generally, that’s called “leakage”. It’s a no-no. It’s cheating. It’s like saying: Tell me how much rain and frost, etc., we’ll get and I can predict crop yields. The point is to predict the rain and frost, etc.