Guest Post by Willis Eschenbach
There is a more global restatement of Murphy’s Law which says “Nature always sides with the hidden flaw”. Parasitic losses are an example of that law at work.
In any heat engine, natural or man-made, there are what are called “parasitic losses”. These are losses that tend to reduce the temperature differentials in the heat engine, and thus reduce the overall efficiency of the engine. In general, parasitic losses increase rapidly, as a percentage, with ∆T, the temperature difference across the engine.

In the climate system, the two main parasitic losses are the losses from the surface to the atmosphere by way of conduction and convection (sensible heat), and the losses from the surface to the atmosphere by way of evaporation and transpiration (latent heat). Both of these parasitic losses act to reduce the surface temperature with respect to the overlying atmosphere, by simultaneously cooling the surface and warming the atmosphere … nature siding with the hidden flaw to reduce the overall system efficiency.

So I decided to see what the CERES data says about parasitic losses. Figure 1 shows the parasitic losses (the sum of sensible and latent heat losses) as a percentage of the total surface input (downwelling longwave plus shortwave).
Figure 1. Parasitic losses (latent and sensible heat loss) from the surface to the atmosphere. Percentage of parasitic loss is calculated as the sum of sensible and latent loss, divided by the total surface input (downwelling shortwave plus downwelling longwave).
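The Figure 1 quantity is straightforward to compute per gridcell. Here is a minimal sketch (not the author's actual code, and the variable names are mine, not CERES field names):

```python
# Sketch of the parasitic-loss percentage per gridcell, as defined in the
# Figure 1 caption: (sensible + latent) / (downwelling SW + downwelling LW).
def parasitic_loss_pct(sensible, latent, dsw, dlw):
    """Parasitic loss as a percentage of total surface input (all in W/m2).

    sensible, latent -- upward sensible and latent heat fluxes
    dsw, dlw         -- downwelling shortwave and longwave at the surface
    """
    return 100.0 * (sensible + latent) / (dsw + dlw)

# Toy gridcell: 20 sensible + 80 latent against 160 SW + 340 LW of input
print(parasitic_loss_pct(20.0, 80.0, 160.0, 340.0))  # 20.0
```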
I was most interested in how much the parasitic loss changes when the total surface input increases. Figures 2 to 4 show that situation:


Figures 2-4. Scatterplots, parasitic loss in watts per square metre (W/m2) versus total surface input (W/m2). Parasitic loss is loss as sensible and latent heat. Gold line shows the loess smooth of the data. Red dots show land gridcells, which are one degree square (1°x1°) in size. Blue dots show ocean gridcells.
I was very encouraged by finding this result. I’ve written before about how at the warm end of the spectrum, parasitic losses would increase to the point where most of each new additional watt striking the surface would be lost as sensible and latent heat, and that little of it would remain to warm the surface. These graphs bear that out entirely. Here’s why.
The slope of the gold line above is the rate of increase in parasitic loss for each additional W/m2 of surface input. As you can see, the slope of the line increases from left to right, although the rate of increase goes up and down.
In order to understand the changes, I took the slope (change in parasitic loss divided by the corresponding change in surface input) at each point along the length of the gold line for both the land and the ocean separately. Figure 5 shows that result.
Figure 5. Change in parasitic loss (in W/m2) for each additional W/m2 of surface input. The “wobbles”, the looped parts in the two graphed lines, reflect subtle changes in the loess smooth and can be ignored.
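The Figure 5 procedure (smooth the scatter, then take the slope along the smooth) can be illustrated on synthetic data. This sketch uses a simple running mean rather than the loess used in the post, and made-up numbers throughout:

```python
# Rough stand-in for the Figure 5 procedure on synthetic data: smooth a
# noisy loss-vs-input curve, then take the slope along the smooth.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(100.0, 500.0, 400)                # total surface input, W/m2
y = 0.0005 * x**2 + rng.normal(0.0, 2.0, x.size)  # synthetic parasitic loss

k = 25                                            # smoothing window (points)
y_smooth = np.convolve(y, np.ones(k) / k, mode="same")

# marginal loss: W/m2 of extra loss per extra W/m2 of surface input
slope = np.gradient(y_smooth, x)
print(round(float(slope[50:-50].mean()), 2))      # ≈ 0.3 at mid-range
```

The edges are trimmed (`[50:-50]`) because both the running mean and the gradient are unreliable near the ends of the record, which is the same caveat the post makes about the smoothing process.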
Now, what are we looking at here? Well, this is how the parasitic loss changes as more and more energy is input to the surface. Where there is little surface input, the loss is low. In fact, at the South Pole the situation is reversed, and the net flow of energy is from the atmosphere to the surface. This is the result of huge amounts of energy being imported from the tropics.
The key point, however, is that as we add more and more energy to a given gridcell the amount of parasitic losses rises, in perfect accordance with nature siding with the hidden flaw. And at the right hand end of the scale, the warmest end, for every additional watt that is added, you lose a watt …
Is this relationship shown in Figure 5 entirely accurate? Of course not; the vagaries of the smoothing process guarantee that it isn’t a precise measure.
But it clearly establishes what I’ve been saying for a while, which is that parasitic loss is a function of temperature, and that at the top end of the scale, the marginal losses are quite large, close to 100%.
Now, as you can see, nowhere is the parasitic loss more than about 30% … but the important finding is that the marginal loss, the loss due to each additional watt of energy gain, is around 100% at the warm end of the planet. Here is the parasitic loss for the planet as a whole versus total surface input as shown in Figure 2:
Figure 6. Change in parasitic loss (in W/m2) for each additional W/m2 of surface input, as in Figure 5, but for the planet as a whole.
Note also that across the main part of the range, which is to say in most of the planet except the tropics and poles, about half of each additional watt of energy increase doesn’t warm the surface … it simply goes into parasitic loss that cools the surface and warms the atmosphere.
Best to all,
w.
PS—If you disagree with what I’ve said please quote my words. That lets all of us know just exactly what you disagree with …
I have a question that I need to clarify in my mind.
On a clear sky, no haze or visible clouds but with the air at say 60% humidity: is the water vapor still a greenhouse gas? It would seem to be, as the water is still there.
Willis, rgb,
The radiative (LW only) heat transfer at the Earth’s surface, which cools the surface, equals LW↑ − LW↓. It is about 60 W/m2 (~395 minus 335) in global and annual average, according to the various Earth’s surface energy budgets. You have to take the net radiative heat exchange, because that is what carries away the energy (or heat) from the surface. The evaporation is ~80 W/m2 and convection ~20 W/m2. These three are balanced by the solar input of ~160 W/m2. Of course, it makes no difference mathematically to view LW↓ as an input and LW↑ as an output, but physically it’s very confusing, misleading and also wrong, IMO. It makes the radiative surface cooling output look larger than it is (395 instead of 60 W/m2), as well as the surface input (495 instead of 160 W/m2).
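Edim's bookkeeping can be checked directly; the numbers below are the ones stated in the comment, nothing more:

```python
# Edim's surface budget, as stated in the comment (global annual means, W/m2)
lw_up, lw_down = 395.0, 335.0
net_lw = lw_up - lw_down           # net radiative cooling of the surface
latent, sensible = 80.0, 20.0      # evaporation, convection
solar = 160.0                      # absorbed solar at the surface

print(net_lw)                               # 60.0
print(net_lw + latent + sensible == solar)  # True: the three losses balance solar
```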
EDim said:
The evaporation is ~80 W/m2 and convection ~20 W/m2.
Both those are proposed as net cooling effects at the surface but:
i) Water vapour that evaporates also has to condense, and rising air cools at the moist adiabatic rate yet descending air warms at the dry adiabatic rate (twice as fast), so I have some doubt as to whether the net effect is of cooling at all.
ii) Adiabatic cooling on ascent is reversed by adiabatic warming on descent so I doubt a net cooling for adiabatic convection.
If surface cooling from those two processes has been overstated then where does that leave DWIR?
Curt says, March 27, 2014 at 9:02 pm:
“Heat transfer is a process, not a thing. Nobody (but you, apparently) believes in the 19th century caloric theory of heat transfer anymore. Please get into the 20th century, if not the 21st. Radiative heat transfer is simply the difference between two opposing electromagnetic radiation energy “flows”, as in upwelling vs downwelling. This has been well understood for a hundred years now.”
Then you absolutely don’t get what I’m saying, Curt. Or you do and you’re just deliberately trying to obfuscate it.
I’m not talking about ‘the caloric theory’. That’s simply you trying to degrade my argument (nice rhetorical tactic). I’m talking about HEAT. Heat remains the same phenomenon today. Here’s the modern definition of heat:
“If a block of hot copper is placed in a beaker of cold water, we know from experience that the block of copper cools down and the water warms up until the copper and water reach the same temperature. What causes this decrease in the temperature of the copper and the increase in the temperature of the water? We say that it is the result of the transfer of energy from the copper block to the water. It is from such a transfer of energy that we arrive at a definition of heat.
Heat is defined as the form of energy that is transferred across the boundary of a system at a given temperature to another system (or the surroundings) at a lower temperature by virtue of the temperature difference between the two systems. That is, heat is transferred from the system at the higher temperature to the system at the lower temperature, and the heat transfer occurs solely because of the temperature difference between the two systems.”
“Heat, like work, is a form of energy transfer to or from a system. Therefore, the units for heat, and for any other form of energy as well, are the same as the units for work, or at least are directly proportional to them. In the International System the unit for heat (energy) is the joule.”
You see, Curt, this definition holds also for radiative situations. According to the ‘energy exchange principle’ (which actually originated in the 18th century (Prevost), a very crude and ‘macroscopic’ concept, but which is still applied pretty much unchanged today, even long after it was discovered that the quantum world does not operate in such a ‘macroscopic’ manner – but hey, this appears to be totally fine with Curt) the heat is defined as the NET transfer. But the heat is still the heat. You can’t separate the two (postulated) opposing ‘streams’ of energy that make up the heat. They’re not separate entities. They’re part of one and the same process. One and the same energy (radiation) field. The transfer occurs instantaneously, automatically, spontaneously.
This is why it is COMPLETELY and UTTERLY un-physical to treat the smaller (least intense) of two such ‘streams’ as an independent ‘thing’ and, as a result, treat it like a separate energy INPUT to the object emitting the larger ‘stream’. As if it were a separate addition of energy to the hot system, that is HEAT.
So yes, Curt, heat transfer is a process, not a thing. You’re the ones treating it like a thing. That can allegedly be split up into two separate energy streams.
The spontaneous transfer of energy, the FLOW of energy, between two objects at different temperatures ALWAYS and ONLY pass from hot to cold. Just like wind from high to low pressure (even as individual air molecules fly around in all directions). Just like an electric current from high to low potential (even as individual electrons fly around in all directions). And this holds true even when radiation is involved. The thermodynamic laws apply to radiative heat transfer also, Curt. I know you would like to argue that somehow they don’t really, but I’m afraid they do.
For an atmosphere as a whole it must be the case (if the atmosphere is to be retained) that UWIR and DWIR net out to zero for PROCESSES OCCURRING WITHIN THE ATMOSPHERE.
That leaves incoming solar energy and outgoing IR to space in balance. It just gets a free pass straight through whilst the atmosphere does its own thermodynamic thing by way of a mix of radiation and conduction and the balance between the two is mediated by convection.
It is atmospheric mass that allows conduction to occur.
Conduction being slower than radiation, the surface temperature must rise above that required for the immediate in/out radiative energy exchange predicted by the S-B equation.
Circulation changes prevent differences in the conductive capability of molecules from destabilising the system. Why should the same not apply to differences in the radiative capability of molecules?
eyesonu says:
March 27, 2014 at 9:06 pm
Assuredly.
w.
Edim says:
March 28, 2014 at 3:13 am
The cooling output IS 395, and the surface input IS 495 W/m2, Edim … that’s what “output” and “input” mean. Output means what is leaving the surface. Input means what is entering the surface.
Of course these are different from the net flow of energy … so? Those are in fact the inputs and the outputs, whether you like it or not.
Look, Edim. Is saying “I paid $300, and I got a $100 rebate” “very confusing, misleading and also wrong” to you?? Because that’s all I’m doing. If you think saying “I paid $300 and got a $100 rebate back” is very confusing and misleading, I fear I can’t help you.
Because that is EXACTLY THE SAME as saying that the surface radiated 300 W/m2 and received 100 W/m2 in back radiation.
Please note that either way is correct. You can either deal with net flows or individual flows. However, for a situation of any complexity, in order to do the math you pretty much have to deal with individual flows … and that’s how it’s been done for hundreds of years.
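Willis's rebate analogy is just two views of the same ledger, which a few lines of arithmetic make plain (numbers are the ones from the comment):

```python
# Gross flows vs net flow: two descriptions of the same transaction.
paid, rebate = 300.0, 100.0
net_cost = paid - rebate
print(net_cost)  # 200.0

# The identical arithmetic for radiation at the surface
surface_emit, back_radiation = 300.0, 100.0
net_radiative_loss = surface_emit - back_radiation
print(net_radiative_loss == net_cost)  # True: same bookkeeping either way
```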
So I truly don’t understand your ceaseless objection to dealing with individual flows. That’s how people do these kinds of calculations, Edim, whether for money or for radiation, so you might as well get used to it or else you’ll be complaining about it for the rest of your life. The world ain’t gonna change how it does the math or discusses the flows just because you don’t happen to like it …
w.
Thanks Willis.
That leads me to the next question. Assume a column of air with a given amount/quantity of water. At the lower/warmer altitude the water vapor is visually transparent while at a little higher altitude pressure/temp decreases causing some of this water to condense to a liquid or frozen state (clouds). Would the change of state change the overall GHG effect with regards to the entire column of a fixed quantity of water? I guess in more of a simple way of looking at this; would be would there be a difference in the GHG effect if the water (fixed amount) was in the form of a cloud or completely visually transparent?
There are more questions to come but I want to address a single aspect at a time. I’m breaking it down to specifics one step at a time.
My grammar in the above post leaves much to be desired.
Quote: I guess in more of a simple way of looking at this; would be would there be a difference in the GHG effect if the water (fixed amount) was in the form of a cloud or completely visually transparent?
Please disregard the first “would be” in that sentence. Multi-tasking on my end. 🙂
eyesonu asked:
“would be would there be a difference in the GHG effect if the water (fixed amount) was in the form of a cloud or completely visually transparent?”
The basic rule is that the more transparent the atmosphere the greater proportion of solar input reaches the surface.
The S-B equation relies on 100% transparency to a perfect blackbody (the Earth is not a perfect blackbody) and also relies on radiation arriving at the surface being radiated out immediately as can only occur with no atmosphere. Additionally, Earth’s oceans complicate the situation by introducing considerable variability between energy in and energy out, hence ocean cycles in general and ENSO in particular.
The more solar input reaches the surface the higher the surface temperature can get at any given level of insolation and the more conduction can occur to the mass of the air above. Deserts with dry, descending air above are a good example.
The less solar input reaches the surface the lower the surface temperature can become at any given level of insolation and the less conduction can occur to the mass of the air above. Tropical jungles with lots of water vapour taking energy upward are a good example. They do not attain the surface temperatures of deserts due to the abundant presence of that GHG water vapour. That is because water vapour is lighter than air and so more convection occurs in humid regions rather than high surface temperatures.
Deserts tend to be at higher latitudes than the equatorial jungles due to the Earth’s Hadley cells yet the surface temperatures are higher than at the equator where insolation is greater.
GHGs reduce atmospheric transparency so they must also reduce the amount of solar energy reaching the surface thereby causing a lower surface temperature than otherwise would have been the case and less conduction.
That is the precise opposite of AGW radiative theory. That theory suggests that DWIR from GHGs adds to the solar input to the surface over and above what would have reached the surface beneath an atmosphere as transparent as the one in which they float.
I don’t follow that reasoning.
If GHGs do introduce net DWIR towards the surface then surely that cannot be any greater than the opposite effect of their reduction in atmospheric transparency, otherwise one breaches the Law of Conservation of Energy?
A cloud, being water vapour condensate (liquid rather than gas), greatly reduces atmospheric transparency and so must have a greater cooling effect on the surface than water vapour.
It all comes down to atmospheric transparency in the end.
AGW theory ignores the adiabatic warming of descending air and so to balance the books needs to propose net downward infra red radiation warming the surface.
That results in double counting.
In any atmosphere, that is retained long term, around a planet, UWIR and DWIR must cancel out to net zero within the atmosphere as a whole.
Any long term discrepancy between DWIR and UWIR within the structure of an atmosphere must result in the loss of that atmosphere.
Atmospheres always settle at a sustainable vertical thermal structure and that is revealed by the lapse rate (or collection of lapse rates) for that particular atmosphere.
GHGs cannot be permitted to permanently upset that vertical thermal structure otherwise the atmosphere will be lost.
Instead, the circulation of air within the atmosphere simply changes via changes in convective overturning to cancel out any potentially destabilising influences such as changes in the radiative or conductive capability of constituent gases.
eyesonu says:
March 28, 2014 at 10:07 am
Water vapor, like the other GHGs, absorbs only part of the upwelling radiation. On average it absorbs about 30% in clear-sky conditions.
Clouds, on the other hand, are basically black bodies w.r.t. longwave radiation. So there is definitely a difference. One absorbs 30%, and the other absorbs 100%.
w.
The point here is that you let the postulated ‘downwelling LW’ component from the cooler atmosphere alone (not in collaboration with other fluxes) raise the temperature of the warmer surface (increase its internal energy) directly (not indirectly) and in absolute terms (not in relative terms). Such an energy flow with such a result is defined in physics as HEAT (or work). And heat in nature does not and cannot go from cold to hot. You seem completely impervious to this simple demonstration that there is clearly something fundamentally wrong (and/or misunderstood) with the traditional ‘Prevost energy exchange principle’.
I have no idea what you are talking about with your directly/indirectly. Flux is flux. Nor do I understand what you are talking about with your introduction of absolute vs relative terms. I was quite clear in my definitions and what is being compared, and Grant Petty is even clearer when he writes out term by term a physics based single layer model and finds the fixed point. In actual fact, when I refer to a single layer perfect SW transmitter, perfect IR absorber atmosphere and the 1.19x surface warming it produces via downwelling radiation from the interpolated perfect absorber layer, I am talking about relative warming, relative precisely to the temperature the surface would have with identical incoming insolation but with no interpolated differentially absorptive layer. Under no circumstances can the energy flow involved be called work, at least not past the molecular level, and at that level there is no such thing as heat as all interactions are reversible.
However, it is your final remark that at last reveals you as a closet Dragonslayer. You somehow think a) heat cannot go from cold to hot; and b) that this observation has anything to do with the process I describe.
I believe that what you mean to say is that the second law of thermodynamics has to be satisfied by the processes involved — and fortunately, it is enormously simple to prove that it is, especially in the single layer model, which never has net heat transfer from cold to hot in the steady state of an open system between the Sun as one reservoir and outer space as the second reservoir. In fact, in the single layer “steel-greenhouse” limit, proving that the second law is happy is one line long, and one can even compute the net entropy increase of the Universe as the heat/energy flows from Mr. Sun, through Mr. Earth, and out into Ms. Space. Consequently you are simply mistaken in your claim of b) — the fact that net heat does not spontaneously flow from a cold reservoir to a hotter reservoir has nothing to do with whether or not the Greenhouse effect leads to a warmer surface with greenhouse gases in an atmosphere than without it, because this never happens in the energy flow.
Now if you want to actually get substantive about this, I’m only going to say to you what I say to Postma when he spouts this sort of crap — show me the equations that prove that the second law is violated by a perfectly valid statement of the first law of thermodynamics that leads one to the 1.19T_0 conclusion.
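The single-layer fixed point rgb refers to can be written out in a few lines. This is a sketch of that textbook model, with S an assumed round number rather than a measured value:

```python
# rgb's single-layer ("steel greenhouse") model: a layer perfectly
# transparent to shortwave and a perfect IR absorber/emitter.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/m2/K^4
S = 239.0                # assumed absorbed solar flux, W/m2

# TOA balance: the layer emits S to space, fixing the layer temperature.
t_layer = (S / SIGMA) ** 0.25
# Surface balance: it receives S (solar) + S (downwelling from layer) = 2S.
t_surf = (2.0 * S / SIGMA) ** 0.25

print(round(t_layer))              # 255 (K)
print(round(t_surf / t_layer, 2))  # 1.19, the 2^(1/4) factor in the text

# Second-law check: net radiant flow surface -> layer is positive (hot to cold)
net_up = SIGMA * t_surf**4 - SIGMA * t_layer**4
print(round(net_up))               # 239, i.e. exactly S
```

Note the second-law check is the one-liner rgb describes: the net flow at every stage (Sun to surface, surface to layer, layer to space) is from hotter to colder.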
rgb
Stephen Wilde says:
March 28, 2014 at 12:23 pm
Stephen, I’ve basically given up reading your comments, as they are almost invariably incorrect. Some things I’m unwilling to let slip by, however. This is one of them.
The S-B (Stefan-Boltzmann) equation has nothing to do with “100% transparency” of the atmosphere. It also has nothing to do with how long the radiation might remain in the object before being re-radiated. Here is the whole equation:
Radiation (W/m2) = S-B constant * emissivity * Temperature^4
You see anything in there about the transparency of the atmosphere or how long the radiation stays absorbed? … didn’t think so …
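The quoted equation can be evaluated directly: temperature and emissivity in, flux out, with no transparency or residence-time term anywhere. A minimal sketch:

```python
# The Stefan-Boltzmann expression as quoted above, evaluated directly.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m2/K^4

def sb_radiation(emissivity, temp_k):
    """Emitted flux in W/m2 for a body of given emissivity and temperature."""
    return SIGMA * emissivity * temp_k ** 4

print(round(sb_radiation(1.0, 288.0), 1))  # 390.1 W/m2 for a 288 K blackbody
```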
Look, Stephen, if you can bring in a reference for either of those claims, we can discuss it. But to just throw out such radical statements without the slightest attempt at support or citation? That’s the very reason I mostly stopped even reading your comments. You just blurt out whatever crazy belief you have in your head at the moment, and you make no attempt to support it with quotes, citations, observations, logic, math or anything but your mouth.
It gets old … here’s another example:
That’s not true in the slightest. It also reveals a profound misunderstanding of the physics. GHGs have almost no effect at all on the transparency of the atmosphere to solar radiation. They simply don’t absorb energy at the wavelengths of visible light. Once again, you’re just running your mouth off, spouting something that is totally untrue without any attempt to back up your BS with anything factual.
It gets old …
w.
Willis said:
“Radiation (W/m2) = S-B constant * emissivity * Temperature^4”
Emissivity relates to how long the radiation stays absorbed, doesn’t it ?
100% emissivity means it comes out immediately.
Zero emissivity means it doesn’t come out at all.
and Willis said:
“GHGs have almost no effect at all on the transparency of the atmosphere to solar radiation”
Transparency works two ways.
What matters is not just the transparency to incoming solar radiation. What matters is how much delay the atmosphere introduces compared to direct and immediate in / out radiation.
That delay is a consequence of conduction to the atmospheric mass.
The conductive absorption by mass is what delays the exit of radiation to space.
A purely radiative solution involves no delay since all radiation is at the speed of light.
To get the delay you have to introduce conduction because it is slower than radiation.
Then, to hold the balance between radiation and conduction you need variable convection.
The amount of conductive absorption that can occur is affected by transparency. It doesn’t matter whether the obstruction is on the way in or on the way out.
You can argue that GHGs change the balance of UWIR / DWIR which is fine by me but if that happens then the amount of conductive absorption by the mass of the atmosphere changes too and in an equal and opposite direction.
More DWIR and more conduction, less DWIR and less conduction, either way a net zero effect on surface temperature.
The surface temperature being stabilised by the change in convection as per your very own thermostat hypothesis. The phase changes of water merely assisting the process by making the necessary air circulation changes less violent than would otherwise be necessary.
If it were otherwise then the atmosphere could not be retained.
You need a mechanism like that for your thermostat to work.
If there is a thermostat, such as you propose, then how else do you see the physics working ?
Kristian:
Once again you manage to get things completely backwards. Let’s go through it step by step.
Bodies emit electromagnetic radiation as a function of their temperature and their emissivity at that temperature. (Their overall emissivity is the integral over all wavelengths of the product of the relative emissivity at each wavelength (between 0 and 1) and the blackbody intensity at that wavelength.) The overall emissivity can, and usually does, vary with temperature. Integration can give you the overall amount of power emitted.
It does not matter what the body is radiating towards, and in most cases it is radiating towards many different bodies. In the case of semi-transparent bodies like the atmosphere, each “slice” of the atmosphere is treated as a body.
This has been well understood even down to the level of statistical mechanics and quantum mechanics for about a century now. It is utterly non-controversial.
Next step: If Body A can radiate toward Body B, then Body B can radiate toward Body A, and will do so through the same geometric path. These streams of electromagnetic radiation are, contrary to your assertion, independent in many important aspects. The path between the two objects is not independent.
We also know in a completely uncontroversial sense that two (or more) streams of electromagnetic radiation pass through each other without impeding each other. For purposes of both analysis and measurement, we can look at individual opposing streams.
For example, the two of us are standing some distance apart on a very dark night. I shine a flashlight at you. Willis, standing between us, points a couple of photosensors at my flashlight. One uses thermal effects; the other uses photoelectric effects. (They agree.)
Now you shine a flashlight at me. Willis’ sensors are still pointing toward my flashlight. By your logic, their readings should change. They don’t. The beams of our flashlights pass through each other unimpeded so we can each see each other’s light using the frequency-sensitive radiation sensors in our retinas.
Next step: What about “heat transfer”? Here we need to take into account three important factors. First, as we have already discussed, the path from A to B is the same as from B to A. Second, the blackbody curve for a higher-temperature body is higher at all wavelengths than the blackbody curve for a lower-temperature body. Third, for any substance, the relative (0 to 1) absorptivity is exactly equal to the relative emissivity at every wavelength.
Given these well-known constraints, it is ALWAYS true that more radiant power from the higher-temperature body is absorbed by the lower-temperature body than radiant power from the lower-temperature body is absorbed by the higher-temperature body. The difference is the net power transfer, and if thermalized (as opposed to, say generating current in a photo-electric device), what we call heat transfer.
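Curt's conclusion is captured by the standard net-exchange expression for two facing blackbody surfaces (unit view factor assumed; the temperatures below are illustrative):

```python
# Net radiant exchange between two blackbody surfaces facing each other:
# the sign always runs from the hotter to the colder body.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m2/K^4

def net_exchange(t_a_k, t_b_k):
    """Net W/m2 transferred from body A to body B."""
    return SIGMA * (t_a_k**4 - t_b_k**4)

print(net_exchange(300.0, 280.0) > 0)  # True: 300 K body heats the 280 K body
print(net_exchange(280.0, 300.0) < 0)  # True: reversing roles flips the sign
```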
What we call the “heat transfer flux” (with flux just being the Latin term for flow) is simply a somewhat useful, but definitely limited, metaphor. Despite your protestations, you are definitely treating it as too real a thing, which is why I can say that you are acting too much like the 19th century believers in the caloric theory of heat. (They had no way of knowing better at the time.)
In a fluid flow between two pressure potentials, the bulk flow rate can be determined from the average velocity of the fluid molecules, with a somewhat normal distribution about that average. In radiant transfer between two bodies at different temperature potentials, there are two opposing radiation streams each at the speed of light. (There is an average only in the sense that the average human has one breast and one testicle…) This very different underlying nature permits us to perform different styles of analysis and measurement.
Perhaps a further comment will help Willis and Curt (and others):
If DWIR and UWIR become unbalanced for the global atmosphere as a whole (i.e, if changes in convection fail to eliminate the imbalance) then surface temperature will indeed change but that would be a self correcting event.
i) If DWIR exceeds UWIR then the surface does indeed warm, but then the system radiates more energy to space than the system is receiving and the system cools until surface temperature is back where it ‘should’ be.
ii) If UWIR exceeds DWIR then the surface cools but then the system radiates less energy to space than the system is receiving and the system warms until surface temperature is back where it ‘should’ be.
That is the true thermostat hypothesis.
A double indemnity.
If convection falls short of its job then radiation completes it.
QED
Willis said:
“GHGs have almost no effect at all on the transparency of the atmosphere to solar radiation. They simply don’t absorb energy at the wavelengths of visible light”
All energy passing through the system,including outgoing IR was initially solar radiation . Absorption of outgoing IR affects overall atmospheric transparency as much as absorption of incoming solar shortwave.
Both add to the delay in transmission of solar energy through the Earth system and thus both add to the time available for conduction to occur.
It is the time available for conduction of energy to atmospheric mass that determines surface temperature and then convection ensues to ensure that the radiation / conduction balance does not get out of alignment with radiation in and out.
GHGs may absorb outgoing IR but in doing so they radiate about half of that absorbed out to space.
In contrast, non-GHGs have to absorb by conduction and, since they cannot radiate out to space, that energy must be passed back to GHGs for direct radiation out, or to the surface via adiabatic warming of descending air before it can be radiated out to space from the surface.
Thus GHGs do have a cooling effect on the surface because less energy needs to be returned to the surface by adiabatic warming on descent. They provide a radiative window to space from within the atmosphere that non GHGs cannot provide.
Convection with GHGs in an atmosphere needs to be less vigorous than convection without GHGs if balance is to be maintained. Add the phase changes of water and even less convective vigour is required due to upward radiation from the higher level condensate.
All that is consistent with, and indeed essential for, your thermostat hypothesis.
Stephen:
You say:
“Emissivity relates to how long the radiation stays absorbed, doesn’t it ?”
No, not at all.
“100% emissivity means it comes out immediately.”
No, it doesn’t. Not even close.
“What matters is how much delay the atmosphere introduces compared to direct and immediate in / out radiation.”
It’s not a question of delay at all.
Get yourself some good thermodynamics and heat transfer textbooks and study them closely, working through a lot of problems. You presently don’t even come close to having the conceptual background to discuss these issues intelligently.
Willis Eschenbach, at 12:50, ended a reply to Stephen Wilde by saying:-
That’s not true in the slightest. It also reveals a profound misunderstanding of the physics. GHGs have almost no effect at all on the transparency of the atmosphere to solar radiation. They simply don’t absorb energy at the wavelengths of visible light. Once again, you’re just running your mouth off, spouting something that is totally untrue without any attempt to back up your BS with anything factual.
It gets old …
w.
So the Magician slips “visible light” out of his sleeve and into his denial of there being notable atmospheric SOLAR HEAT absorption, when he knows very well that around 14% (~200 W/m2) of solar IR HEAT is blocked, according to reports online.
The folly of treating backradiation (~335 W/m^2) as a “forcing” on equal thermodynamic footing with insolation (~160 W/m^2) at the surface is seen in the resulting BB temperature of ~306K. The fact of the matter is that Earth’s surface temperature is by no means the product of radiative processes alone. Nor can it be determined by a simplistic, mass-less radiation balance, wherein thermal energy stored in various reservoirs within the system is represented by oppositely directed LWIR fluxes. Clearly ANY such fluxes with a NET value of ~60 W/m^2 will make the TOA power budget balance, but leave the surface temperature undetermined. That’s why advanced models depend upon empirical determinations.
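The ~306K figure quoted above can be checked directly. As a sketch, assuming the naive summing the comment criticizes (downwelling LW of ~335 W/m^2 simply added to absorbed solar of ~160 W/m^2 and treated as a single black-body forcing), the Stefan-Boltzmann law gives roughly that temperature:

```python
# Sketch of the arithmetic criticized in the comment above: naively summing
# back-radiation (~335 W/m^2) and insolation (~160 W/m^2) and treating the
# total as a single black-body forcing.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def bb_temperature(flux_w_m2: float) -> float:
    """Equilibrium black-body temperature for a given absorbed flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

total_input = 160.0 + 335.0  # W/m^2, insolation plus back-radiation
print(round(bb_temperature(total_input), 1))  # ~305.7 K, i.e. the ~306 K quoted
```

This reproduces the comment's point that the combined flux, taken at face value, implies a surface far hotter than observed.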
rgbatduke says, March 28, 2014 at 12:48 pm:
”I have no idea what you are talking about with your directly/indirectly. Flux is flux.”
Good. Then we seem to agree: you let your flux down from the atmosphere heat the surface directly, that is, not by reducing the outgoing flux (decreased cooling), but by adding to the incoming flux (increased heating). Can’t do that.
”Nor do I understand what you are talking about with your introduction of absolute vs relative terms.”
Relative warming relates to flattening a cooling curve so that at a certain point in time the cooling object (now cooling slower than before) is at a higher temperature than what it would’ve been if the cooling went faster. Relative warming never makes the object hotter than originally.
Absolute warming is simply making an object hotter than what it was before. The surface of the earth would for instance according to you equilibrate at 255K with only solar input (239 W/m^2 IN, 239 W/m^2 OUT). At this point it doesn’t cool and it doesn’t warm. There is no cooling curve to flatten. But then you put your radiatively active atmosphere on top of it and apparently introduce an extra flux to the surface. This flux according to you physically raises the temperature of earth’s surface. It makes it warmer than before.
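The relative/absolute distinction drawn in the two paragraphs above can be illustrated with a toy Newton-cooling sketch (the temperatures, rate constants, and step counts here are arbitrary illustrations, not figures from the thread): flattening the cooling curve leaves the object warmer at every later time than the faster-cooling case, but never warmer than it started.

```python
# Toy model of 'relative warming': a slower cooling rate flattens the
# cooling curve but never lifts the object above its starting temperature.
def cool(T0, T_env, k, steps, dt=1.0):
    """Explicit Euler integration of Newton's law dT/dt = -k (T - T_env)."""
    T = T0
    history = [T]
    for _ in range(steps):
        T += -k * (T - T_env) * dt
        history.append(T)
    return history

fast = cool(T0=350.0, T_env=250.0, k=0.05, steps=100)
slow = cool(T0=350.0, T_env=250.0, k=0.02, steps=100)  # reduced cooling rate

# The slowly cooling object is warmer at every step than the fast one...
assert all(s >= f for s, f in zip(slow, fast))
# ...yet 'relative warming' never exceeds the initial temperature.
assert max(slow) == 350.0
```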
”In actual fact, when I refer to a single layer perfect SW transmitter, perfect IR absorber atmosphere and the 1.19x surface warming it produces via downwelling radiation from the interpolated perfect absorber layer, I am talking about relative warming, relative precisely to the temperature the surface would have with identical incoming insolation but with no interpolated differentially absorptive layer.”
That is not relative warming, Robert. That is absolute warming. You’re not letting your postulated ‘downwelling radiation’ reduce cooling. You’re letting it increase heating. You’re not subtracting. You’re adding. You say so yourself.
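For reference, the "1.19x" factor quoted from rgbatduke follows from the textbook single-slab model he describes: with a perfectly IR-absorbing layer, the surface must emit twice the absorbed solar flux in equilibrium, so its temperature is 2^(1/4) ≈ 1.189 times the no-atmosphere value. A minimal sketch, using the 239 W/m^2 figure from the comment above:

```python
# Single-layer ('slab') model arithmetic behind the quoted 1.19x factor:
# in equilibrium the surface emits 2x the absorbed solar flux, so
# T_surface = 2**0.25 * T_no_atmosphere ~= 1.189 * T_no_atmosphere.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

solar = 239.0  # W/m^2 absorbed, as stated in the comment
T0 = (solar / SIGMA) ** 0.25        # no-atmosphere equilibrium, ~254.8 K
Ts = (2.0 * solar / SIGMA) ** 0.25  # single-layer-model surface, ~303.0 K

print(round(T0, 1), round(Ts, 1), round(Ts / T0, 3))  # ratio ~1.189
```

Whether that arithmetic constitutes "relative" or "absolute" warming is exactly the point in dispute in this exchange; the sketch only shows where the factor comes from.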
”You somehow think a) heat cannot go from cold to hot”
It can’t. In nature.
”b) that this observation has anything to do with the process I describe.”
It does. Because you expect your ‘downwelling radiation’ to produce a result exactly as if it were a heat flux. You don’t SAY it. But you DO it.
”I believe that what you mean to say is the second law of thermodynamics has to be satisfied the processes involved — and fortunately, it is enormously simple to prove that it is, especially in the single layer model, which never has net heat transfer from cold to hot in the steady state of an open system between the Sun as one reservoir and outer space as the second reservoir.”
You refuse to address what I’m pointing to. You refuse to see it. It’s not hard. In fact, it’s ‘enormously simple’. If the 2nd Law is to be satisfied then you will HAVE TO treat the alleged exchange of energy between the surface and the atmosphere as ONE process, the NET FLOW. You can’t just ignore the one part and treat the other as if it were a heat flux unto itself. That’s what you do.
You hope to circumvent what the 2nd Law clearly states by first expecting the one ‘stream’ of energy in a spontaneous NET energy exchange, the one going from cold to hot, to give a result AS IF IT itself were a net flow, that is, HEAT, meaning raising the temperature in absolute terms of the object receiving it.
Then, when this is pointed out to you, then all of a sudden you appeal to the NET concept. But this is simply the nonsensical ‘HEAT flows both ways, only more from hot to cold than from cold to hot’-argument.
The sun doesn’t help you, because the sun has no part in the EXTRA heating of the surface of the earth, the part going from 255 to 288K. The flux in from the sun is not increased. The resulting flux going out from the surface is not reduced, not obstructed in its free escape at all. There is ONLY the extra flux down from the atmosphere. IT does the extra heating. As if it were a second HEAT flux to the surface. Making the atmosphere a heat source to its own heat source, the surface. Or, in reality, making the surface, through recycling of emitted energy, its OWN heat source.
You’re breaking both the 1st and the 2nd Laws, Robert.
”Consequently you are simply mistaken in your claim of b) — the fact that net heat does not spontaneously flow from a cold reservoir to a hotter reservoir has nothing to do with whether or not the Greenhouse effect leads to a warmer surface with greenhouse gases in an atmosphere than without it because this never happens in the energy flow.”
I’m not mistaken in my ‘claim’ b). You don’t understand (or you don’t want to understand) what I’m getting at. You make energy from the cold reservoir, the atmosphere, make the hot reservoir, the surface, even warmer. That doesn’t work in the real world. Because that is a transfer of HEAT from cold to hot.
What you have to do is reduce the energy flow actually leaving the hotter object. You’re not doing that. You increase it rather by adding more incoming. Reducing the outgoing will obey the 2nd Law. And this is done by decreasing the temperature difference (gradient) between the two objects at different temperatures. You know, like the radiative heat transfer equation shows us.
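The "radiative heat transfer equation" referred to above is the standard net-exchange form, in which the net flux depends on the difference of the fourth powers of the two temperatures. A sketch (the 288 K / 255 K / 3 K temperatures and unit emissivity here are illustrative assumptions, not figures from this thread):

```python
# Standard net radiative exchange between two bodies: the net flux scales
# with the difference of the fourth powers of their temperatures, so a
# warmer cold side (a gentler gradient) means a smaller net loss.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(T_hot, T_cold, emissivity=1.0):
    """Net radiant flux (W/m^2) from the hotter body to the colder one."""
    return emissivity * SIGMA * (T_hot ** 4 - T_cold ** 4)

# A 288 K surface radiating to ~3 K space versus to a 255 K atmosphere:
to_space = net_radiative_flux(288.0, 3.0)        # ~390 W/m^2
to_atmosphere = net_radiative_flux(288.0, 255.0)  # ~150 W/m^2
print(round(to_space, 1), round(to_atmosphere, 1))
assert to_atmosphere < to_space  # gentler gradient -> reduced net loss
```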
So an atmosphere on top of a solar-heated surface will certainly create a gentler temperature gradient away from the surface than in the no-atmo case. Because the atmosphere can be warmed. Space can’t.
The point is, the atmosphere would’ve been warmed even without radiative gases in it, through heat transfer mechanisms other than radiation, and the temperature gradient away from the surface would surely not become any steeper than with an atmosphere with radiative gases in it. (Lapse rate, solar surface heating and convection determine that.)
The atmosphere, however, couldn’t be adequately cooled to space without radiative gases (your so-called GHGs). Because there are no other available heat transfer mechanisms than radiation in this case.
”Now if you want to actually get substantive about this, I’m only going to say to you what I say to Postma when he spouts this sort of crap — show me the equations that prove that the second law is violated by a perfectly valid statement of the first law of thermodynamics that leads one to the 1.19T_0 conclusion.”
You’re not this stupid, Robert. You’re only trying to redirect from your blatant violation of both the 1st and 2nd Laws in one go. You expect ‘back radiation’ from the cooler atmosphere to heat the warmer surface that already emitted the very same energy as a thermal loss, after it first warmed the atmosphere.
This is how this whole thing started. You can’t just split a heat flow into two separate entities and then pretend that they can be treated as individual heat fluxes. The quantum world doesn’t work in a ‘macroscopic’ fashion like that. The postulated ‘downwelling radiation’ from the atmosphere is not a separate INput to the surface. It is PART OF, a COMPONENT of the spontaneous and continuous radiative HEAT transfer going from the surface to the atmosphere, the OUTput.
I told you in front of all your readers: all anyone has to do is check any field on earth where mankind causes heat to leave things. Every major field of cooling on earth reports that convective/conductive losses supersede radiant ones.
You’re actually the one coming up with the kind of work renowned for being filled with false information, trying to turn the entire world of thermal engineering on its head, because you want radiation to predominate in most things in the natural world.
You’re just wrong, as reference to any field on earth, architecture or any other, proves: everyone worldwide uses convection/conduction models to a far, far greater extent than radiant heat loss.
You’re simply pretending you somehow weren’t shown to be erroneous, because people don’t even have to be shown; they can find enough science to check for themselves.
Willis Eschenbach says:
March 27, 2014 at 6:40 pm
Still no observations facts, citations, or quotes to support your claims? I’m totally uninterested in your further claims until you come up with evidence.
w.
Which fields do you know of that cool things primarily via radiant heat loss, Willis? I’m curious which field you personally ever worked in managing heat, where the objects you cooled lost four-fifths of their heat radiatively.
For instance, if you’re right that 4/5ths, about 80%, of the energy lost in nature is radiant, the entire world is going to know you’re right.
If I go to, say, Google or Bing, will I see anybody besides you claiming “80% of total energy losses are radiant in nature”?
I’ll see that all over the place people are relying on the well-known fact that 80% of energy lost from objects in nature is lost through radiant loss, and people will be talking about it all over the world.
Won’t I Willis?
Let’s check: “80% of total energy losses are radiant” search return:
Shortened for ease of handling http://goo.gl/xS8OGZ
What about Bing, then? Maybe not a soul worldwide who knows thermodynamics and holds that 80% radiant loss predominates uses Google, but they all do over on Bing. http://goo.gl/BBjZaT
Nothing there.
There’s nothing there because you’re bluffing Willis, there is no wide and credible field which teaches most losses from objects in the atmosphere – 80% – are radiant.
The fact is you’re the one who’s in here trying to chide professionals in a field you’ve never worked in, about falsehoods so easily disproven that you don’t even bother trying to explain how you arrived at “about half the energy lost to the atmosphere is through evaporation.”
In any responsible conversation where an adult human being got caught as swiftly and badly as you have, there would be a moment of reflection on the absurdity of your trying to claim 80% radiative loss to the atmosphere while you admit nearly 50% through evaporation.
You have got to be one of the poorest excuses for someone claiming to be thermodynamically literate that I know of, Willis.
Let’s see you indicate which fields of science agree with your bombastic 80% claim. The one you made commenting on the post where you got caught not knowing that nearly half the losses are from evaporation, while you claimed 80% losses from radiation.
You then provided a chart showing 353 watts from the surface, claiming it proves your 400 watts claim.
You’re a disgrace to scientific discussion with this kind of intellectual dishonesty and if you tried what you do here professionally you wouldn’t last a week.
—————————-
“The evaporation of the meter of water causes Earth’s surface to lose 83 watts per square meter, almost half of the sunlight that reaches the surface.”
Dang, James, actual numbers. You surprise me in a good way.
Yes, I agree that somewhere around 80 W/m2 of energy is lost from the surface by evapotranspiration, and that this is around half of the solar absorbed by the surface. This agrees also with the KT energy diagram, as well as the CERES data.
Now, we know that the upwelling LW from the surface is about 400 W/m2. We derive these figures in a couple of ways, from both ground measurements and satellite measurements.
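As a quick cross-check of the ~400 W/m2 figure, inverting the Stefan-Boltzmann law (assuming emissivity near 1, an assumption for this sketch) gives a surface temperature close to the familiar global mean of ~288 K:

```python
# Cross-check: inverting the Stefan-Boltzmann law for ~400 W/m^2 of
# upwelling LW (emissivity ~1 assumed) gives a temperature near the
# observed global-mean surface value.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

upwelling_lw = 400.0  # W/m^2, as stated above
T_surface = (upwelling_lw / SIGMA) ** 0.25
print(round(T_surface, 1))  # ~289.8 K, i.e. about 16.7 C
```

The closeness of that number to measured mean surface temperature is one reason the ~400 W/m2 figure is considered well constrained.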