Global Energy Balances … Except When It Doesn’t

Guest Post by Willis Eschenbach.

I came across an interesting 2014 paper called The energy balance over land and oceans: an assessment based on direct observations and CMIP5 climate models. In it, they make a number of comparisons between observational data and 43 climate models regarding the large-scale energy flows of the planet. Here’s a typical graphic:

Figure 1. ORIGINAL CAPTION: “Fig. 7 Average biases (model - observations) in downward solar radiation at Earth’s surface calculated in 43 CMIP5 models at 760 sites from GEBA. Units Wm−2”. “CMIP5” is the Coupled Model Intercomparison Project, Phase 5, the fifth iteration of a project which compares the various climate models and how well they perform.

Now, what this is showing is how far the forty-three models are from the actual observations of the amount of solar energy that hits the surface. Which observations are they comparing to? In this case, it’s the observations stored in the Global Energy Balance Archive (GEBA). Per the paper:

Observational constraints for surface fluxes primarily stem from two databases for worldwide measurements of radiative fluxes at the Earth surface, the global energy balance archive (GEBA) and the database of the Baseline Surface Radiation Network (BSRN).

GEBA, maintained at ETH Zurich, is a database for the worldwide measured energy fluxes at the Earth’s surface and currently contains 2500 stations with 450,000 monthly mean values of various surface energy balance components. By far the most widely measured quantity is the solar radiation incident at the Earth’s surface, with many of the records extending back to the 1950s, 1960s or 1970s. This quantity is also known as global radiation, and is referred to here as downward solar radiation. Gilgen et al. (1998) estimated the relative random error (root mean square error/mean) of the downward solar radiation values at 5 % for the monthly means and 2 % for yearly means.
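As an aside, for readers who want that “relative random error” statistic made concrete, here is a minimal sketch in Python. The numbers are invented for illustration only; they are not GEBA data.

```python
import numpy as np

# Relative random error (RMSE / mean), the statistic Gilgen et al. (1998)
# quote for GEBA downward solar radiation. Values below are hypothetical
# monthly means in W/m^2, not real station data.
measured  = np.array([182.0, 195.0, 170.0, 210.0, 188.0, 176.0])
reference = np.array([190.0, 188.0, 178.0, 202.0, 185.0, 181.0])

rmse = np.sqrt(np.mean((measured - reference) ** 2))
relative_error = rmse / np.mean(reference)

print(f"RMSE = {rmse:.1f} W/m^2, relative random error = {relative_error:.1%}")
```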

So downwelling solar radiation at the surface is very well measured at a number of sites over decades. And surprisingly, or perhaps unsurprisingly given their overall poor performance, the climate models do a really, really bad job of emulating even this most basic of variables—how much sunshine hits the surface.

Now, bear in mind that for these models to be even remotely valid, the total energy entering the system must balance the energy leaving the system. And if the computer models find a small imbalance between energy arriving and leaving, say half a watt per square metre or so, they claim that this is due to increasing “net forcing, including CO2 and other GHGs” and it is going to slowly heat up the earth over the next century.

So their predictions of an impending Thermageddon are based on half a watt or a watt of imbalance in global incoming and outgoing energy … but even after years of refinement, they still can’t get downwelling sunlight at the surface even roughly correct. The average error at the surface is seven watts per square metre, and despite that, they want you to believe that they can calculate the energy balance, which includes dozens of other energy flows, to the nearest half a watt per square metre?

Really?

Now, I wrote my first computer program, laboriously typed into Hollerith cards, in 1963. And after more than half a century of swatting computer bugs, I’ve learned a few things.

One thing I learned is the mystic power that computers have over people’s minds. Here’s what I mean by “mystic power”: if you take any old load of rubbish and run it through a computer, when it comes out the other end there will be lots of folks who believe it is absolutely true.

For example, if I were to tell you “I say that in the year 2100 temperatures will average two degrees warmer than today”, people would just point and laugh … and rightly so. All I have to back it up are my assumptions, claims, prejudices, and scientific (mis)understandings. Anybody who tells you they know what the average temperature will be in eighty years is blowing smoke up your astral projection. Nobody can tell you with any degree of certainty what the average temperature will be in two years, so how can they know what the temperature will be in eighty years?

But when someone says “Our latest computer model, which contains over a hundred thousand lines of code and requires a supercomputer to run it, says that in the year 2100 temperatures will be two degrees warmer than today”, people scrape and bow and make public policy based on what is nothing more than the physical manifestation of the programmers’ same assumptions, claims, prejudices, and scientific (mis)understandings made solid.

And how do we know that is a fact, rather than just a claim that I’m making based on a half-century of experience programming computers?

Because despite the hundred thousand lines of code, and despite the supercomputers, and despite the inflated claims, the computer models can’t even calculate how much sunshine hits the surface … and yet people still believe them.


Here, after a week of rain, we had sunshine today and we’re looking at a week more. And if you think the models are bad at figuring out the sunshine, you really don’t want to know how poorly they do regarding the rain …

My best to everyone,

w.



112 Comments
Nick Schroeder
January 24, 2019 6:16 pm

So what, exactly, do they presume to measure?

Emissivity & the Heat Balance

Emissivity is defined as the ratio of the radiative heat leaving a surface to the theoretical maximum, or blackbody (BB), radiation at the surface temperature. The heat balance defines what enters and leaves a system, i.e.
Incoming = outgoing, W/m^2 = radiative + conductive + convective + latent

Emissivity = radiative / total W/m^2 = radiative / (radiative + conductive + convective + latent)

In a vacuum (conductive + convective + latent) = 0 and emissivity equals 1.0.

In open air full of molecules, other transfer modes reduce radiation’s share and emissivity, e.g.:
conduction = 15%, convection = 35%, latent = 30%, radiation & emissivity = 20%

The Instruments & Measurements

But wait, you say, upwelling (& downwelling) LWIR power flux is actually measured.

Well, no it’s not.

IR instruments, e.g. pyrheliometers, radiometers, etc. don’t directly measure power flux. They measure a relative temperature compared to heated/chilled/calibration/reference thermistors or thermopiles and INFER a power flux using that comparative temperature and ASSUMING an emissivity of 1.0. The Apogee instrument instruction book actually warns the owner/operator about this potential error noting that ground/surface emissivity can be less than 1.0.
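A minimal sketch of the arithmetic behind this claim, using hypothetical numbers. Whether it correctly describes how the instruments actually work is disputed further down the thread.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(temp_k, emissivity=1.0):
    """Grey-body thermal emission from a surface at temp_k kelvin."""
    return emissivity * SIGMA * temp_k ** 4

# Hypothetical ground at 290 K with a true emissivity of 0.95
t_ground = 290.0
true_flux = emitted_flux(t_ground, emissivity=0.95)    # what the surface actually emits
assumed_flux = emitted_flux(t_ground, emissivity=1.0)  # the value an e = 1.0 assumption attributes to it

print(f"true emitted flux   = {true_flux:6.1f} W/m^2")
print(f"flux assuming e=1.0 = {assumed_flux:6.1f} W/m^2")
print(f"difference          = {assumed_flux - true_flux:6.1f} W/m^2")
```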

That this warning went unheeded explains why SURFRAD upwelling LWIR with an assumed and uncorrected emissivity of 1.0 measures TWICE as much upwelling LWIR as incoming ISR, a rather egregious breach of energy conservation.

This also explains why USCRN data shows that the IR (SUR_TEMP) parallels the 1.5 m air temperature (T_HR_AVG), and not the actual ground (SOIL_TEMP_5). The actual ground is warmer than the air temperature with few exceptions, contradicting the RGHE notion that the air warms the ground.

https://www.linkedin.com/feed/update/urn:li:activity:6494216722554458112

https://www.linkedin.com/feed/update/urn:li:activity:6457980707988922368

commieBob
Reply to  Nick Schroeder
January 24, 2019 6:51 pm

As you demonstrate, this really is a place where errors are at least as systematic as they are random. The accuracy of a global figure may be improved if you have a lot of data and the errors are precisely random plus a bunch of other constraints. That is absolutely not the case when there are systematic errors. In that case, the accuracy of the global figure is at least as bad as the systematic errors.
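A minimal sketch of that point, with invented numbers: averaging many sites beats down the random part of the error, but a bias shared by every site is untouched by the averaging.

```python
import numpy as np

rng = np.random.default_rng(42)

true_value = 184.0      # hypothetical "true" flux, W/m^2
random_sigma = 9.0      # random (uncorrelated) error per site, W/m^2
systematic_bias = 7.0   # bias shared by every site, W/m^2

n_sites = 760
measurements = true_value + systematic_bias + rng.normal(0.0, random_sigma, n_sites)

error_of_mean = measurements.mean() - true_value
print(f"error of the {n_sites}-site average: {error_of_mean:+.2f} W/m^2")
# The random part shrinks roughly as sigma/sqrt(N) (~0.3 W/m^2 here),
# but the shared 7 W/m^2 systematic bias survives the averaging intact.
```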

It always blows my mind that ‘they’ take data that sometimes isn’t even measured within plus or minus twenty percent and calculate results to within a tenth of a percent.

IMHO, the only way ‘they’ produce plausible looking energy balances is by applying Cook’s constant.

Greg
Reply to  commieBob
January 25, 2019 2:08 am

A quick scan of the paper seems to show that they are not using just ground-based data.

The first paragraph of the section entitled “Observational data and models” states they are using CERES, which IIRC Willis has already pointed out has a massive 5 W/m^2 energy imbalance. (A net warming imbalance so huge that we know it is wrong, because we would already be living through Thermageddon.)

So it is unclear what fig 1 is showing us. Are the models exaggerating the 5 W/m^2 energy imbalance, or does that get swept under the carpet and reset to zero before comparing to models? Is that graph regional or global? I really do not know what we are looking at here.

Greg Goodman
Reply to  commieBob
January 25, 2019 2:43 am

The uncertainty in the solar reflected TOA fluxes from CERES due to uncertainty in absolute calibration is ~2 % (2σ), or equivalently 2 Wm−2. The uncertainty of the outgoing thermal flux at the TOA as measured by CERES due to calibration is ~3.7 Wm−2 (2σ) (Loeb et al. 2009). In the CERES energy balanced and filled (EBAF) dataset, solar and thermal TOA fluxes are adjusted within their range of uncertainty to be consistent with independent estimates of the global heating rate based upon in situ ocean observations, and are made available on a 1° grid (Loeb et al. 2009).

So it seems the cumulative 5.7 W/m^2 uncertainty has been used to allow “correction” of the CERES flux to match ocean observations.

Willis’ post here only concerns the individual land based sites, it seems. So that is a side issue to what he posted about here.

We apply a linear regression between the model biases and their respective land means shown in Fig. 12 (significant at the 95 % level). We use the orthogonal regression method that minimizes the distances orthogonal to the regression line, in contrast to the standard least-squares regression that only minimizes the distances along the vertical axes.

Wow, at last someone in climatology who is at least aware of the limitations of OLS. Although orthogonal regression is not a panacea, since it effectively assumes errors of similar magnitude in both variables, it is a big step in the right direction.
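For anyone unfamiliar with the distinction, here is a minimal sketch on synthetic data contrasting ordinary least squares with orthogonal (total least squares) regression:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with comparable noise in both variables
n = 200
x_true = np.linspace(0, 10, n)
y_true = 2.0 * x_true + 1.0
x = x_true + rng.normal(0, 1.0, n)
y = y_true + rng.normal(0, 1.0, n)

# Ordinary least squares: minimizes vertical distances only
slope_ols = np.polyfit(x, y, 1)[0]

# Orthogonal (total least squares) regression: minimizes perpendicular
# distances, obtained here from the principal axis of the centred data
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(X, full_matrices=False)
slope_tls = vt[0, 1] / vt[0, 0]

print(f"true slope 2.00 | OLS {slope_ols:.2f} | orthogonal {slope_tls:.2f}")
# With noise in x as well as y, OLS is biased low ("attenuated");
# orthogonal regression recovers the slope when the error variances
# in the two variables are comparable.
```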

Michael S. Kelly, LS, BSA, Ret.
Reply to  commieBob
January 25, 2019 3:57 am

Wow, that’s precariously close to infringing on my own dimensionless number, the Kelly Number (Nk). It is defined as: the right answer divided by the answer I got (the dimension of “answer” cancelling), and though ideally Nk = 1.0, it can take on any real or complex value. Multiplying by the answer I got then in all cases yields the right answer.

It got me through my Master’s in Mechanical Engineering.

Greg
Reply to  Michael S. Kelly, LS, BSA, Ret.
January 25, 2019 6:21 am

Wow, a frig factor with an imaginary part, pure genius.

If you could expand it to 26 dimensions and apply it to string theory you could probably save modern physics and win a Nobel prize.

commieBob
Reply to  Greg
January 25, 2019 7:25 am

I think, but can’t prove, that these guys beat you to it.

DD More
Reply to  Michael S. Kelly, LS, BSA, Ret.
January 25, 2019 3:06 pm

So the Nk is not to be confused with Flannegan’s Finangling Fudge Factor?

From my high school math teacher’s post board –
Flannegan’s Finangling Fudge Factor – “That quantity which, when multiplied by, divided by, added to, or subtracted from the answer you get, gives you the answer you should have gotten.”

Also known as SKINNER’S CONSTANT
Only successfully used when the correct answer is known.

John
Reply to  commieBob
January 25, 2019 5:54 pm

Commie Bob, how do they explain tropical caps and ice ages? No humans then. It’s a scam that man has any influence on climate.

Ed Bo
Reply to  Nick Schroeder
January 25, 2019 12:44 pm

Nick:

You’ve been told time after time that your definitions and analysis are completely wrong. Yet you keep repeating the same fundamental mistakes again and again.

Your definition of emissivity is ridiculously wrong. Any textbook or reference will explain that emissivity is the ratio of thermal radiation actually emitted to the maximum possible (“blackbody”) thermal radiation that could be emitted at that temperature. Period.

It has nothing to do with conductive or convective transfers that may or may not be occurring at the same time, as you wrongly assert. You have been challenged before to find any reference that uses your definition, and you have not done so.
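To make the distinction concrete, here is a minimal sketch with hypothetical numbers, contrasting the textbook ratio with the “partition” ratio used in the comment above:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Hypothetical surface at 288 K emitting 370 W/m^2 of thermal radiation,
# while also losing 17 W/m^2 by conduction, 80 by convection and 80 as latent heat.
T = 288.0
radiative, conductive, convective, latent = 370.0, 17.0, 80.0, 80.0

blackbody = SIGMA * T ** 4  # ~390 W/m^2 maximum possible emission at 288 K

textbook_emissivity = radiative / blackbody  # ~0.95
partition_ratio = radiative / (radiative + conductive + convective + latent)  # ~0.68

print(f"textbook emissivity (emitted / blackbody) = {textbook_emissivity:.2f}")
print(f"radiative share of total heat loss        = {partition_ratio:.2f}")
# The two numbers answer different questions; only the first is what
# textbooks (and instrument calibrations) mean by "emissivity".
```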

You claim to have formally studied these topics, but you continually get the most basic concepts of thermodynamics and heat transfer totally wrong. The pink unicorn brigade at PSI may fall for your fantasies, but not here.

Reply to  Ed Bo
January 25, 2019 6:43 pm

That’s right, Ed. Emissivity is for radiation. Thermal conductivity is for conduction. The coefficient of heat convection is for convection (this is used in engineering but not much in meteorology).

Moreover, a vacuum does not have an emissivity because it does not radiate. Or, theoretically, the vacuum field radiates according to quantum mechanics, but that is Hawking radiation due to a gravitational field.

Dan DaSilva
January 24, 2019 6:21 pm

Does wattsupwiththat have a virus? Why is Google Safe Browsing giving a warning on entering the site? Is this real or a trick?

Latitude
Reply to  Dan DaSilva
January 24, 2019 6:31 pm

does it direct you to Skeptical Science?

Dan DaSilva
Reply to  Latitude
January 24, 2019 6:34 pm

Ha, ha

Frank Mansillas
Reply to  Dan DaSilva
January 24, 2019 6:54 pm

Same happened to me, thought I had malware.

Reply to  Frank Mansillas
January 24, 2019 8:07 pm

This is coming from an embedded advertisement. I’ve turned off an ad on the right sidebar. Please advise if you see it again.

Reply to  Dan DaSilva
January 24, 2019 7:04 pm

I assume you are using Chrome?

I quit using Chrome over 2 years ago because of Alphabet’s/Google’s policies, hidden search bias, and URL re-direct manipulations.

Amazingly, MS’s Bing and MS Explorer are coming back into vogue again because of some shady shit that Google is doing.

Dan DaSilva
Reply to  Joel O'Bryan
January 24, 2019 7:28 pm

Yes, Chrome. Thanks

RicDre
Reply to  Joel O'Bryan
January 24, 2019 7:49 pm

“I quit using Chrome over 2 years ago …”

I quit using Chrome recently because I happened to be watching the data Sent and Received numbers on my internet link and noticed that Chrome was sending an unusual amount of data somewhere. When I went back to Internet Explorer the problem disappeared so I deleted Chrome.

Ian W
Reply to  RicDre
January 25, 2019 7:26 am

You may need to avoid MS browsers in future too, as I have read that MS plans to use Chromium as the engine for Edge’s replacement.

https://www.theverge.com/2018/12/4/18125238/microsoft-chrome-browser-windows-10-edge-chromium

RicDre
Reply to  Dan DaSilva
January 24, 2019 7:41 pm

“Does wattsupwiththat have virus?”

I think Griff was having a similar problem. Griff, if you are out there, did you ever resolve the problem you were having with getting a warning from Norton about WUWT having a virus?

Patrick MJD
Reply to  Dan DaSilva
January 24, 2019 7:46 pm

I was getting the same issue, but it seems to have “gone” now.

Hivemind
Reply to  Dan DaSilva
January 25, 2019 12:33 am

I get the same thing on my laptop with Firefox trying to access DuckDuckGo.

Jaap Titulaer
Reply to  Dan DaSilva
January 25, 2019 1:35 am

There is a new MS thingy for which they hired some lefty external company. It rates sites as to whether they are left or right, and anything to the right of center gets flagged as BAD.
Propaganda at its best; Dr. G. would be proud.
I kid you not.

Carbon Bigfoot
Reply to  Dan DaSilva
January 25, 2019 4:09 am

Periodically I get the “Can’t Display This Page” BS; in fact, I did this AM. I know that clicking the fix button will resolve the issue, but new folks probably just move on.

Editor
January 24, 2019 6:35 pm

Willis, thank you for finding and commenting on Wild et al. 2014.

Regards,

Bob

Terry Jay
January 24, 2019 6:44 pm

The old Garbage in, Gospel out ploy

Steven Mosher
January 24, 2019 6:50 pm

“square metre or so, they claim that this is due to increasing CO2 and it is going to slowly heat up the earth over the next century.”

No, they don’t argue it is due to CO2.

It is due to the sum of all net forcing, including CO2 and other GHGs.

You can’t attack a theory you misrepresent.

Basics.

Reply to  Steven Mosher
January 24, 2019 7:29 pm

Steven,

When will you learn? The ONLY forcing influence is the Sun. Anything else is represented by the change in solar forcing that would have the same effect on the surface temperature as some change to the system with the solar forcing held constant.

The total solar forcing is 240 W/m^2, which results in about 390 W/m^2 of net average surface emissions representative of the average temperature. Assuming all Joules are equivalent, this is about 1.62 W/m^2 of surface emissions per W/m^2 of forcing. It’s unambiguously clear that the next W/m^2 of solar input (or the last one) cannot have changed the surface emissions by 4.3 W/m^2, as would follow from the presumed 0.8 C temperature change. The only possible sensitivity is about 0.3 C per W/m^2, which is less than the lower limit presumed by the IPCC!
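For concreteness, here is a minimal sketch reproducing that arithmetic. It only reproduces the commenter’s numbers via the Stefan-Boltzmann relation; it is not an endorsement of the interpretation.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

solar_forcing = 240.0     # W/m^2, post-albedo solar input (commenter's figure)
surface_emission = 390.0  # W/m^2, average surface emission (~288 K)

ratio = surface_emission / solar_forcing        # ~1.62 W/m^2 per W/m^2
t_surface = (surface_emission / SIGMA) ** 0.25  # ~288 K

# Stefan-Boltzmann slope: dT/dE = T / (4 E), in K per W/m^2 of surface emission
dT_per_emission = t_surface / (4.0 * surface_emission)
dT_per_forcing = dT_per_emission * ratio        # K per W/m^2 of forcing

print(f"emission/forcing ratio  = {ratio:.2f}")
print(f"dT per W/m^2 of forcing = {dT_per_forcing:.2f} K")  # ~0.30
```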

Feedback doesn’t amplify the Planck sensitivity as the ‘consensus’ incorrectly presumes. The best you can claim is that feedback increases the power sensitivity of the surface from 1 W/m^2 of surface emissions per W/m^2 of forcing e=1 (ideal black body) up to the 1.62 W/m^2 observed for the Earth (e=0.62).

You still haven’t answered the question about how the planet can tell the next Joule (or the last one) from all the others so that it can be so much more powerful at warming (or cooling) the surface than any other?

Unless you have a coherent answer to this question, you simply can not support any ECS greater than about 0.35C per W/m^2.

Reply to  co2isnotevil
January 24, 2019 8:44 pm

Even Christopher Monckton has gone along lately with feedbacks being slightly net positive. One of the positive feedbacks is the surface albedo feedback, whose magnitude varies with global temperature. I see it as having been more strongly positive during the comings and goings of ice age glaciations, at times even making global climate unstable until it changes quickly to a state that has some stability. And I see the surface albedo feedback as less strongly positive during the interglacial periods.

Note recent studies by Nic Lewis, and Lewis & Curry, supporting ECS being around 1.5 K per 2xCO2, which is about 0.4 K per W/m^2.

Reply to  Donald L. Klipstein
January 24, 2019 9:50 pm

Donald,

The concept of feedback, relative to the way it was applied to the climate system, is irrelevant. Bode’s linear feedback amplifier model that it was based on has no correspondence to the actual climate system.

The fact that the surface emits 620 mW/m^2 more per W/m^2 of forcing than the forcing alone would produce indicates an apparent gain > 1, while the open-loop gain (i.e. the gain of an ideal BB) is exactly 1, so the net ‘feedback’ must be positive. However, since the open-loop gain is only 1 and there is no source of output Joules beyond the forcing Joules, positive feedback has no possibility of causing runaway or any of the other scary things often associated with positive feedback.

David Stone
Reply to  co2isnotevil
January 25, 2019 2:54 am

In fact, the important point is that positive feedback in an amplifier depends on there being an external power supply (energy source) to provide the amplified output! All of these models which assume “gain” cannot work unless the external source of energy is defined, which it is not; they just assume it is free. An amplifier (gain stage) is also inefficient in the process, so the supply energy is always more than the sum of the input and output power.

Reply to  co2isnotevil
January 25, 2019 8:05 am

A fact that Monckton refuses to address in connection with his model.

Reply to  co2isnotevil
January 25, 2019 8:58 pm

Positive feedback doesn’t necessarily cause runaway or other scary things, that requires positive feedback sufficient to make gain infinite. Then again, the climate did become unstable at times during ice sheet advances and retreats when such advances and retreats made major changes in surface absorption of sunlight.

As for power supply to an amplifier: Consider the tunnel diode oscillator, and an amplifier that can be made with a similar circuit using a tunnel diode. Sunlight is the analog of the power supply in a tunnel diode oscillator. (Yes, I know the tunnel diode is an obsolete component, but they were made and they are great examples for amplifier theory.)

As for Christopher Monckton’s slightly positive feedback: It is a serious oversimplification that does not consider global temperature feedback varying with global temperature, and his oversimplification does not consider albedo feedbacks; those would add to the positivity he found. Notably, Christopher Monckton seems to like finding climate feedbacks being more-negative, according to how I see his postings to WUWT. In a similar manner, he even claimed multiple times that The Pause started in 1996.

Reply to  co2isnotevil
January 25, 2019 9:40 pm

Phil.,
Monckton is trying to explain things within the context of what the IPCC’s self-serving consensus claims. My position is that the IPCC is so wrong it’s embarrassing, and there’s no sense in trying to be consistent with what they claim.

Alan D. McIntire
Reply to  Donald L. Klipstein
January 25, 2019 11:12 am

As opposed to the REAL world, where feedbacks are NEGATIVE.
See figure 2 here:

http://www-eaps.mit.edu/faculty/lindzen/235-LindzenGRL.pdf

As the “Sesame Street” song goes, “One of these things is not like the others..” And that’s the graph based on the real world rather than models.

Reply to  co2isnotevil
January 26, 2019 2:37 am

Ha ha, Steve Mosher is struggling here! THIS is all he could come up with? A weak technicality:

“…no they don’t argue it is due to co2.
it is due to the sum of all net forcing, including CO2 and other GHGs.
You can’t attack a theory you misrepresent…”

RicDre
Reply to  Steven Mosher
January 24, 2019 7:37 pm

Mr. Mosher said: “it is due to the sum of all net forcing, including CO2 and other GHGs”

Ok, that makes sense. So approximately what percentage of the non-natural atmospheric warming does the theory say is the result of CO2 emitted into the atmosphere by human activity?

MarkW
Reply to  Steven Mosher
January 24, 2019 7:40 pm

You are misrepresenting how much misrepresentation is going on.
The claim has always been that any changes in other GHGs are due directly to changes in CO2.

PS: It really is funny for you, of all people to whine about others misrepresenting the arguments of those they are disagreeing with, in light of the whoppers you were telling this morning.

Dave Fair
Reply to  Willis Eschenbach
January 24, 2019 9:19 pm

+ a lot, then more alot, then we go back to a lot.

Reply to  Steven Mosher
January 25, 2019 4:18 am

@Steven – regardless of the replies below, you know quite well that you are “misrepresenting the theory.” Hypothesis, actually, and a disproven one, that forcing by increased CO2 concentration raises the temperature, which raises the other GHG (primarily H2O) concentrations, which further raises the temperature in a high-gain (which is where the hypothesis is wrong) positive feedback.

No serious poster or commenter here “misrepresents” your hypothesis – we merely abbreviate it.

tom0mason
Reply to  Steven Mosher
January 26, 2019 11:32 am

By their ideas all this ‘extra’ solar energy stays as heat.
What utter prefossilized coprolite!

Nature in all its abundance would be hastily transferring most of that ‘extra’ solar energy for chemical energy in organic bonds, especially as the atmospheric CO2 level rises.
Energy balance is complete BS, a myth, something to tell aspiring ‘climate modelers’.
This energy balance is a moving target …
This planet naturally takes in solar energy, and just as naturally ejects some; it does NOT have to balance at any moment!
The ‘balance’ is a dynamic dependent on the conditions both on the planet (all those weather/climate variables, and the prevailing overall chemistry in the oceans, land, and atmosphere) and on what the sun emits, and all the changes over time.

To view it any other way is to absurdly reduce all this planet’s many natural processes to nought.

David Young
January 24, 2019 6:52 pm

Willis, I think the models are generally tuned to get the top of atmosphere radiation imbalance correct (at least in so far as we can measure it). That is probably achieved by arranging cancellation of much larger errors as you show here. That doesn’t build any confidence for me however.

Reply to  David Young
January 24, 2019 7:10 pm

As reported in various journal admissions, when most modellers try to close the TOA energy imbalance in their model, it runs way, way too hot, something like 6K to 11 K per CO2 doubling… way too hot. So they have to let the model’s energy imbalances exist based on their subjective judgements (hand tuning puppetry). They also tune in higher than observed amounts of aerosols (for cooling) for 20th Century calibration runs as well — Something Trenberth has lamented about.

David Young
Reply to  Joel O'Bryan
January 24, 2019 8:11 pm

Joel, Do you have a reference for the TOA imbalance point?

Joel O'Bryan
Reply to  David Young
January 24, 2019 9:00 pm

Yes David,
Start down that Climate Rabbit Hole with:

Climate scientists open up their black boxes to scrutiny
URL:
http://science.sciencemag.org/content/354/6311/401

“Recently, while preparing for the new model comparisons, MPIM modelers got another chance to demonstrate their commitment to transparency. They knew that the latest version of their model had bugs that meant too much energy was leaking into space. After a year spent plugging holes and fixing it, the modelers ran a test and discovered something disturbing: The model was now overheating. Its climate sensitivity—the amount the world will warm under an immediate doubling of carbon dioxide concentrations from preindustrial levels—had shot up from 3.5°C in the old version to 7°C, an implausibly high jump.

MPIM hadn’t tuned for sensitivity before—it was a point of pride—but they had to get that number down. Thorsten Mauritsen, who helps lead their tuning work, says he tried tinkering with the parameter that controlled how fast fresh air mixes into clouds. Increasing it began to ratchet the sensitivity back down. “The model we produced with 7° was a damn good model,” Mauritsen says.”

Did you get that?: “7° was a damn good model” !!!! LOL!! It literally doesn’t get any more junk science than that. The climateers are very right in that exposing their junk models would lead to ridicule. Because they know what they do ***IS*** junk science, but it pays the bills and brings in the paychecks. So they keep doing it. It really is all they can do.

Then (if you are brave enough to handle the ugly truth) proceed to:
“THE ART AND SCIENCE OF CLIMATE MODEL TUNING”
https://journals.ametsoc.org/doi/10.1175/BAMS-D-15-00135.1

Go to their Figure 2 in that article (and the text therein). The ice crystal fall velocity (a parameter hand-tuned in the models that use it) shows that even small tweaks of a parameter that is very poorly constrained by observation greatly affect the TOA energy balance outcome in the model (GFDL CM3) they highlight.

If you can get through that BAMS article on climate modeling in its entirety and still believe Climate Modeling has anything to do with objective science…. then you are a True Believer and beyond help.

Greg Goodman
Reply to  Joel O'Bryan
January 25, 2019 3:00 am

Thanks for that info Joel. It’s worse than we thought.

There is also Hansen et al. 2002, which states quite openly that you can get whatever climate sensitivity you wish by your choice of values for unconstrained parameters.

There is a link in my article on tropical climate sensitivity:
https://judithcurry.com/2015/02/06/on-determination-of-tropical-feedbacks/

I find the sensitivity to volcanic events has been exaggerated by tweaking the scaling of AOD to W/m^2. My result agrees with earlier physics-based estimations by the same group, which were abandoned in favour of arbitrary values in an attempt to reconcile model outputs with the climate record.

Reply to  David Young
January 24, 2019 9:19 pm

Yes.
My response is long and stuck somewhere in moderation (likely because of the HTML formatting and the URLs included for your verification, so as to demonstrate I’m not making this stuff up).
Climate Models are junk science.
Be patient. Check back here sometime in the next 2 days. Maybe it’ll get through.

January 24, 2019 7:00 pm

The very fact that the Coupled Model Intercomparison Project exists at all as the accepted validation method for the models (an intercomparison of models), rather than comparison to observation for validation, should clue in even the dimmest of bulbs among science and engineering college majors that climate modelling today is Cargo Cult Science (a.k.a. junk science).

And then to take the ensemble mean of all the junk outputs and try to apply a confidence interval to that mean … well that defies any hope of actually knowing anything at all about our climate and CO2.

DMA
January 24, 2019 7:15 pm

This miscalculation by the models is similar to the cloud-formation uncertainties that were the basis for Pat Frank’s error propagation analysis. It looked like good work to me, based on my scant knowledge of measurement theory and error analysis, but he never found a journal that would print it. The result was that, with the known uncertainty in the initialization, the uncertainty propagated very rapidly and rendered the results completely unreliable.
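For readers unfamiliar with the general idea, here is a generic sketch of how an uncorrelated per-step uncertainty compounds through an iterative calculation. The per-step value is a placeholder, not Frank’s number, and this is not his actual analysis.

```python
import numpy as np

# If each simulated step contributes an independent uncertainty sigma_step
# to a propagated quantity, the accumulated uncertainty after n steps
# grows as sigma_step * sqrt(n) (root-sum-square accumulation).
sigma_step = 1.0           # placeholder uncertainty per step (arbitrary units)
steps = np.arange(1, 101)  # e.g. 100 annual steps
accumulated = sigma_step * np.sqrt(steps)

for n in (1, 10, 50, 100):
    print(f"after {n:3d} steps: +/- {accumulated[n - 1]:.1f}")
```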

Darcy
January 24, 2019 7:21 pm

Curious as to why the IPCC does not cull out models that cannot replicate the basic driving parameters of the system being modeled, rather than quoting an average of the output. Or, even better, require model updates until a match is achieved (within a range small enough not to affect the parameter variation under study). Model verification/validation 101. Such is required and SOP for robust decision making in the DoD and commercial world, with dollars an order of magnitude lower than the investments being called for.

Dave Fair
Reply to  Darcy
January 24, 2019 9:37 pm

The CMIP3 models vastly underestimated the reduction in Arctic ice. In CMIP5 they tried to get greater ice reductions, and succeeded to a limited extent. CMIP5 Arctic ice was still too high, but the attempt to get it right negatively affected other aspects of the models, making those metrics more inaccurate.

Additionally, every effort to downscale model results to regional hindcast/projection has been unsuccessful. They get the different land regions and individual ocean basins/sub-basins grossly wrong. They get some global success through tuning to the late 20th Century. Otherwise, they resemble shotgun blasts in their accuracy.

Dave Fair
Reply to  Darcy
January 24, 2019 9:45 pm

We cannot measure the net energy balance to the accuracy required to verify the models’ assumptions/output. IIRC, the net flow increase was calculated at about 1% +/- 10% of the total energy flows. Cargo Cult Science.

icisil
Reply to  Darcy
January 25, 2019 5:28 am

Because they are not interested in accuracy. Imagine if they did hurricane forecasting like they do CMIP; it would be useless.

MarkW
Reply to  icisil
January 25, 2019 9:33 am

The really bad models are needed to pull the average of the ensemble in the direction they need it to go.

bit chilly
Reply to  Darcy
January 25, 2019 6:37 pm

Darcy, if they did that they wouldn’t have any models to keep the funding coming in.

Mr.
January 24, 2019 7:24 pm

Maybe the solar detectors were facing the wrong way Wills?
After all, for eight long years it was shining out of Barack’s ar5e.

January 24, 2019 7:37 pm

The graph comparing the models regarding solar SW at the surface doesn’t show the uncertainties of the measurements. In Wild’s paper he mentions 184 W/m2 with errors ranging from 5 to 10 W/m2. In AR5 WG1, the comparable estimate is 176 to 192 W/m2 (+/- 8 W/m2), combining SW absorbed and SW reflected at the surface. So even when a model correctly approximates the measurements’ central tendency, they are still looking for a change in energy flux amounting to a fraction of the measurement errors. The diversity of the models is one problem; measurement noise is perhaps worse.

Matthew R Marler
January 24, 2019 7:42 pm

Thanks for the essay and the link to the paper.

Not to be missed, my favorite: “Considerable uncertainties remain also in the magnitudes of the non-radiative fluxes of sensible and latent heat and their partitioning over both land and oceans.”

Patrick MJD
January 24, 2019 7:50 pm

“But when someone says “Our latest computer model, which contains over a hundred thousand lines of code and requires a supercomputer to run it, says that in the year 2100 temperatures will be two degrees warmer than today”, people scrape and bow and make public policy based on what is nothing more than the physical manifestation of the programmers’ same assumptions, claims, prejudices, and scientific (mis)understandings made solid.”

That’s because most people, and especially politicians and people in positions of power and influence, do not know how computers work, let alone how to program them. Computer modelling is used almost universally and accepted as Gospel.

ChasTas
January 24, 2019 8:22 pm

“All models are wrong, some are useful”

Dave Fair
Reply to  ChasTas
January 24, 2019 9:51 pm

“Useful” as in supporting more government grants and crony, rent seeking handouts?

MarkW
Reply to  Dave Fair
January 25, 2019 9:34 am

Useful as in making short-range weather forecasts, helping you design airplanes or electronic circuits, keeping the power grid stable, etc.

MarkW
Reply to  Dave Fair
January 25, 2019 9:36 am

Even the GCMs “could” be useful if they were being used for their original purpose of helping scientists understand what it is they don’t know regarding weather and climate.

whiten
Reply to  MarkW
January 25, 2019 12:17 pm

Exactly…

January 24, 2019 8:52 pm

Is there any reason given in the subject paper why it reviewed 43 climate models and not all (what 105?) of them?

January 24, 2019 9:18 pm

Are there separate Energy Budget charts for daytime (12 hours) and nighttime (12 hours), similar to the Kiehl-Trenberth Energy Budget chart?

January 24, 2019 9:20 pm

Regarding “say half a watt per square metre or so” imbalance, said also as “So their predictions of an impending Thermageddon are based on half a watt or a watt [I assume this is per square meter] of imbalance in global incoming and outgoing energy”, with regard also to “average error at the surface is seven watts per square metre”:

The small imbalance figure of 0.5-1 watt per square meter, multiplied by the actual ECS climate sensitivity in terms of degrees per W/m^2, is how much future global warming we will have if we freeze atmospheric concentrations of all GHGs other than water vapor at where they are now.

The larger figure, the “average error at the surface [of] seven watts per square metre”, is an error in an income of over 200 W/m^2 from sunlight alone, or nearly 400 W/m^2 of total surface radiation income (sunlight/daylight plus IR radiated downward from greenhouse gases and clouds), and it is probably close to the W/m^2 errors in radiation outgo in the cited climate models. I suspect the error these models have for income minus outgo averages out to income exceeding outgo by about 1 W/m^2 more than is actually the case.
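The rough arithmetic behind the first point, using the ~0.4 K per W/m^2 sensitivity cited earlier in the thread. This is illustrative only, not a result from the paper.

```python
# Committed warming ~ present-day imbalance x equilibrium sensitivity per W/m^2
imbalance_w_m2 = (0.5, 1.0)    # claimed present-day imbalance range, W/m^2
sensitivity_k_per_w_m2 = 0.4   # ~1.5 K per 2xCO2 (the Lewis & Curry figure cited above)

for q in imbalance_w_m2:
    warming = q * sensitivity_k_per_w_m2
    print(f"{q:.1f} W/m^2 imbalance -> ~{warming:.2f} K committed warming")
```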

Lasse
January 25, 2019 12:23 am

10% more sunshine since 1983 (SMHI)
Fewer clouds and more sun hours.

Could it be solar brightening due to less sulfur in the air?
https://journals.ametsoc.org/doi/pdf/10.1175/BAMS-D-11-00074.1

Jaap Titulaer
Reply to  Lasse
January 25, 2019 2:11 am

Ah no.
Sulfur reductions did not even start in most of the EU until the early 90s (after the Wall). The Eastern Bloc used a lot of brown coal and had zero measures in place to reduce sulfur emissions. And they only started after large money grants from the EU.
Only a few countries (W-EU) did anything in the 80s; Portugal and Spain had just joined, so they had other things to focus on. Most of the South (except maybe France) did not even start until this millennium, same for most of the Eastern Bloc.
China did not do anything positive (have they even now?); Chinese sulfur has only been on the increase IMHO.
Japan was early, as were some W-EU countries and the Nordics. But that is really a drop in a bucket compared to everything in between.
So you could only notice the reductions in sulfur in Eurasia sometime well after 2005.

The re-greening of the European forests was mostly driven by increased CO2, not by timely reductions in sulfur; that came later. Woods in Czechia and Poland were already getting greener when the acidity of their rain was still high enough to damage things like marble statues.

Greg Goodman
Reply to  Jaap Titulaer
January 25, 2019 3:11 am

The brightening happened in the stratosphere, in the decades after El Chichon and Mt. Pinatubo.

This is reflected in cooling of the lower stratosphere.

After the initial TLS warming (cf. surface cooling), both events were followed by a persistent drop of 0.5 deg C in TLS. This indicates a more transparent stratosphere and more solar energy reaching the lower climate system.

Lasse
Reply to  Jaap Titulaer
January 25, 2019 6:02 am

Still, we in Sweden at least have had fewer clouds since 1983, and 10% more sunshine (W/m^2).

Reply to  Lasse
January 25, 2019 9:35 pm

One thing I see in what the IPCC reports of climate models is a highly positive water vapor feedback, around or even slightly above what constant relative humidity would give, along with a positive cloud albedo feedback.

The way I see these things, global warming is making clouds more effective/efficient at moving heat (and associated upward moving air), which means that a warmer world has less coverage by clouds and updrafts as these get more efficient, and greater coverage by downdrafting clear air. I see this as meaning that the water vapor positive feedback is less positive than according to constant relative humidity, and I suspect the combined positive feedbacks from water vapor and cloud albedo are limited to what the water vapor feedback alone would be if clouds were frozen/unchanging in their coverage and ability to move heat.

January 25, 2019 12:24 am

But when someone says “Our latest computer model…

I noticed this a long time ago when personal computers first started to make their appearance. Brash young business grads with a bit of savvy started showing up with pretty graphs and pie charts, saying it was from a computer program, and senior management of major corporations would swoon. After a while the people with real-life experience who knew what they were doing either retired or learned to make pretty pictures of their own and level the playing field. But the gullibility of experienced senior management because the pie chart was from a computer, with no thought at all to the validity of the data, was astounding … and led to a lot of really bad decisions.

peterg
January 25, 2019 12:59 am

Some computer models are very accurate. SPICE models electronic circuits quite successfully. So if you are developing a circuit, first you run it in SPICE if at all possible, then breadboard it, then develop the PCB, etc.

A model that is the sum of accurate sub-models has a possibility of being accurate. However a model that is the sum of inaccurate sub-models has no chance.

A computer model in the end is only a calculation. The calculation may be correct or incorrect. Until verified experimentally, it is only an hypothesis.
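A toy sketch of the sub-model point, with invented error figures: chaining several sub-models compounds their individual errors, so accurate parts can compose into an accurate whole while inaccurate parts cannot.

```python
import numpy as np

rng = np.random.default_rng(1)

def chained_output(frac_error, n_submodels=5, n_runs=10_000):
    """Toy "model" whose output is the product of several sub-model factors,
    each carrying an independent fractional error of size frac_error."""
    factors = 1.0 + rng.normal(0.0, frac_error, size=(n_runs, n_submodels))
    return factors.prod(axis=1)

for err in (0.01, 0.20):
    spread = chained_output(err).std()
    print(f"sub-model error {err:.0%} -> spread of final result ~ {spread:.1%}")
```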

MarkW
Reply to  peterg
January 25, 2019 9:42 am

I believe the point is that you don’t go from a SPICE run directly to production. SPICE is used to narrow down which designs you want to take to breadboarding etc.

The GCMs are trying to go directly from model runs to full scale production.

tom0mason
Reply to  peterg
January 26, 2019 12:13 pm

However, the parameters used in SPICE are relatively few and very well understood.
Not quite the same with climate models where there are a huge number of interacting parameters that are not all well understood.

A C Osborn
January 25, 2019 1:54 am

Mr Eschenbach, is this database available to the general public?

Greg
January 25, 2019 3:23 am

There is a link for BSRN in the paper, but it seems that you have to apply for FTP read access, and I have not looked into the conditions. It does not appear to be openly available. There does not seem to be a link for GEBA, which is maintained at ETH Zurich. The Swiss are not generally that open either.

https://bsrn.awi.de/data/data-retrieval-via-ftp/

meiggs
January 25, 2019 4:18 am

As one who does power plant energy balances, I can tell you for certain that once one resorts to “tuning” a model, it means they lack the ability to grasp and simulate a simple system. A good model can make accurate predictions about off-design modes of operation. A “tuned” model cannot, period. Meanwhile, none of the quantum folks have dreamt up a way to directly measure LWIR??

http://www.nusod.org/2016/nusod16paper20.pdf

icisil
January 25, 2019 4:46 am

“…take any old load of rubbish and run it through a computer, when it comes out the other end, there will be lots of folks who will believe it is absolutely true.”

Sounds like science crime syndicate money laundering.

Steve O
January 25, 2019 4:50 am

It seems like a stretch to say that you can be confident about an effect of 0.5 W/m^2 if you have an imbalance of ten times that. I’d be interested in hearing what the scientists say and why they believe that they’ve overcome that.
