Slaying the 'Slayers' with Watts – part 2

Light Bulb Back Radiation Experiment

Guest essay by Curt Wilson

In the climate blogosphere, there have been several posts recently on the basic principles of radiative physics and how they relate to heat transfer. (see yesterday’s experiment by Anthony here) These have spawned incredibly lengthy streams of arguments in the comments between those who subscribe to the mainstream, or textbook view of radiative heat transfer, and those, notably the “Skydragon Slayers” who reject this view.

A typical statement from a Slayer is that if “you have initially a body kept at a certain temperature by its internal source of energy”, and another body at a lower temperature is placed near it, then the radiation from this colder body cannot increase the temperature of the warmer body, as that would violate the 2nd Law of Thermodynamics. They continue that if this were possible, the two objects would increase each other’s temperature indefinitely, an obvious violation of the 1st Law of Thermodynamics (energy conservation).

This is part of a more general claim by Slayers that radiation from a colder body cannot transfer any energy to a warm body and lead to a higher temperature of the warm body than would be the case without the presence of the colder body.

It occurred to me that these claims were amenable to simple laboratory experiments that I had the resources to perform. A light bulb is a classic example of a body with an internal source of energy, and several Slayers have specifically used reflection back to a light bulb as such an example.

In our laboratory, we often have to do thermal testing of our electronic products so we can ensure their reliability. Particularly when it comes to power electronics, we must consider the conductive, convective, and radiative heat transfer mechanisms by which heat can be removed from these bodies with an “internal source of energy”. We have invested in good thermocouple measurement devices, regularly calibrated by a professional service, to make the temperature measurements we need.

We often use banks of light bulbs as resistive loads in the testing of our power electronics, because it is a simple and inexpensive means to load the system and dissipate the power, and it is immediately obvious in at least a qualitative sense from looking at the bulbs whether they are dissipating power. So our lab bench already had these ready.

If you want to isolate the radiative effects, the ideal setup would be to perform experiments in a vacuum to eliminate the conductive/convective losses. However, the next best thing is to reduce and control these to keep them as much alike as possible in the different phases of the experiment.

So, on to the experiment. This first picture shows a standard 40-watt incandescent light bulb without power applied. The lead of the thermocouple measuring device is taped to the glass surface of the bulb with heat-resistant tape made for this purpose. The meter registers 23.2C. In addition, a professional-grade infrared thermometer is aimed at the bulb, showing a temperature of 72F. (I could not get it to change the units of the display to Celsius.) Note that throughout the experiment, the thermocouple measurements are the key ones.

[Photo: unpowered bulb; thermocouple reads 23.2C, IR thermometer 72F]

Next, the standard North American voltage of 120 volts AC (measured as 120.2V) was applied to the bulb, which was standing in free air on a table top. The system was allowed to come to a new thermal equilibrium. At this new equilibrium, the thermocouple registered 93.5C. (The IR thermometer showed a somewhat lower 177F, but remember that its reported temperature makes assumptions about the emissivity of the object.)

[Photo: powered bulb in free air; thermocouple reads 93.5C]

Next, a clear cubic glass container about 150mm (6”) on a side, initially at the room temperature of 23C, was placed over the bulb, and once again the system was allowed to reach a new thermal equilibrium. In this state, the thermocouple on the surface of the bulb registered 105.5C, and the outer surface of the glass container registered 37.0C (equivalent to body temperature).

The glass container permits the large majority of the radiative energy to escape, both in the visible portion of the spectrum (obviously) and in the near infrared, as standard glass is highly transparent to wavelengths as long as 2500 nanometers (2.5 microns). However, it does inhibit the direct free convection losses, as air heated by the bulb can only rise as far as the top of the glass container. From there, it must conductively transfer to the glass, where it is conducted through the thickness of the glass, and the outside surface of the glass can transfer heat to the outside ambient atmosphere, where it can be convected away.
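As a rough check on that transparency claim, one can integrate the Planck spectrum numerically and ask what fraction of a blackbody’s emission falls below the 2.5 micron cutoff. A minimal sketch, assuming a typical filament temperature of about 2700 K and a glass envelope near 100 C (both assumed values, not measurements from this experiment):

```python
import math

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(lam, t):
    """Planck spectral density vs wavelength (arbitrary normalization)."""
    x = H * C / (lam * KB * t)
    if x > 700.0:  # exp would overflow; contribution is negligible anyway
        return 0.0
    return lam**-5 / (math.exp(x) - 1.0)

def fraction_below(cutoff, t, lam_max=100e-6, n=20000):
    """Fraction of total emission at wavelengths shorter than cutoff."""
    dlam = lam_max / n
    lams = [(i + 0.5) * dlam for i in range(n)]
    total = sum(planck(l, t) for l in lams)
    below = sum(planck(l, t) for l in lams if l < cutoff)
    return below / total

f_filament = fraction_below(2.5e-6, 2700.0)  # glowing filament (assumed temp)
f_envelope = fraction_below(2.5e-6, 370.0)   # ~100 C glass surface

print(f"Fraction below 2.5 um at 2700 K: {f_filament:.2f}")
print(f"Fraction below 2.5 um at  370 K: {f_envelope:.4f}")
```

At the filament temperature, roughly 80% of the emission lies in the band the glass passes; at the envelope temperature essentially none of it does, which is why the glass is transparent to the bulb’s radiation but opaque to ambient-temperature thermal radiation.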

[Photo: bulb under the glass container; thermocouple reads 105.5C]

The next step in the experiment was to wrap an aluminum foil shell around the glass container. This shell would not permit any of the radiative energy from the bulb to pass through, and would reflect the large majority of that energy back to the inside. Once again the system was allowed to reach thermal equilibrium. In this new state, the thermocouple on the surface of the bulb registered 137.7C, and the thermocouple on the outer surface of the glass registered 69.6C. The infrared thermometer was not of much use here due to the very low emissivity (aka high reflectivity) of the foil. Interestingly, it did show higher temperatures when focused on the tape on the outside of the foil than on the foil itself.

[Photo: glass container wrapped in aluminum foil; bulb thermocouple reads 137.7C]

Since adding the foil shell outside the glass container could be reducing the conductive/convective losses as well as the radiative losses, the shell was removed and the system with the glass container only was allowed to re-equilibrate at the conditions of the previous step. Then the glass container was quickly removed and the foil shell put in its place. After waiting for thermal equilibrium, the thermocouple on the surface of the bulb registered 148.2C and the thermocouple on the outside of the foil registered 46.5C. The transient response (not shown) was very interesting: the temperature increase of the bulb was much faster in this case than in the case of adding the foil shell to the outside of the glass container. Note also how low the infrared thermometer reads (84F = 29C) on the low-emissivity foil.

[Photo: foil shell alone over the bulb; thermocouple reads 148.2C]

Further variations were then tried. A foil shell was placed inside the same glass container and the system allowed to reach equilibrium. The thermocouple on the surface of the bulb registered 177.3C, the thermocouple on the outer surface of the foil registered 67.6C, and the infrared thermometer reading the outside of the glass (which has high emissivity at the wavelengths of ambient thermal radiation) read 105F (40.6C).

[Photo: foil shell inside the glass container; bulb thermocouple reads 177.3C]

Then the glass container was removed from over the foil shell and the system permitted to reach equilibrium again. The thermocouple on the surface of the bulb registered 176.3C and the thermocouple on the outside of the foil registered 50.3C.

[Photo: inner foil shell alone; bulb thermocouple reads 176.3C]

All of the above examples used the reflected shortwave radiation from the aluminum foil. What about absorbed and re-emitted longwave radiation? To test this, a shell of black-anodized aluminum plate, 1.5mm thick, was made, of the same size as the smaller foil shell. A black-anodized surface has almost unity absorption and emissivity, both in the shortwave (visible and near infrared) and longwave (far infrared). Placing this over the bulb (without the glass container), at equilibrium, the thermocouple on the bulb registered 129.1C and the thermocouple on the outside of the black shell registered 47.0C. The infrared thermometer read 122F (50C) on the tape on the outside of the shell.

[Photo: black-anodized aluminum shell over the bulb; thermocouple reads 129.1C]

The power source for this experiment was the electrical input. The wall voltage from the electrical grid was steady at 120.2 volts. The electrical current was measured under several conditions with a professional-grade clip-on current sensor. With the bulb in open air and a surface temperature of 96.0C, the bulb drew 289.4 milliamperes of current.

[Photo: current measurement with bulb in open air; 289.4 mA]

With the bulb covered by a foil shell alone and a surface temperature of 158.6C, the bulb drew slightly less, 288.7 milliamperes.

[Photo: current measurement with foil shell in place; 288.7 mA]

Summary of Results

The following table shows the temperatures at equilibrium for each of the test conditions:

Condition                                                            Bulb Surface Temp   Shell Temp
Bulb open to room ambient                                                 95C               -
Bulb covered by glass container alone                                    105C              37C
Bulb covered by glass container and outer reflective foil shell          138C              70C (glass)
Bulb covered by outer reflective foil shell alone                        148C              46C (foil)
Bulb covered by inner reflective foil shell inside glass container       177C              68C (foil)
Bulb covered by inner reflective foil shell alone                        176C              50C
Bulb covered by black-anodized aluminum shell alone                      129C              47C

Analysis

Having multiple configurations permits us to make interesting and informative comparisons. In all cases, there is about a 35-watt (120V x 0.289A) electrical input to the system, and thermal equilibrium is reached when the system is dissipating 35 watts to the room as well.
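As a quick sanity check of that figure, the measured voltage and currents imply the following input powers (assuming the bulb behaves as a nearly pure resistive load, so P = V × I holds for the RMS values):

```python
# Electrical input power implied by the measured values.
voltage = 120.2        # measured RMS volts
current_open = 0.2894  # amps, bulb in open air
current_foil = 0.2887  # amps, bulb under foil shell

p_open = voltage * current_open
p_foil = voltage * current_foil

print(f"Input power, open air:   {p_open:.1f} W")
print(f"Input power, foil shell: {p_foil:.1f} W")
# Both round to about 35 W, so the input is essentially constant
# across the test conditions (and slightly LOWER in the hotter case).
```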

I used a low-wattage (40W nominal) bulb because I had high confidence that it could take significant temperature increases without failure, as it has the same package design as much higher-wattage bulbs. Also, I would not be working with contraband high-wattage devices 😉

The case with the glass container alone is the important reference case. The glass lets virtually all of the radiant energy through, while inhibiting direct convection to the room ambient temperature of 23C. Conductive/convective losses must pass from the surface of the bulb, through the air under the container, to and through the glass, and then to the room atmosphere, where the heat is convected away. Under these conditions, the bulb surface temperature is 105C, which is 10C greater than when the bulb can dissipate heat directly to the room atmosphere.

Compare this case to the case of the larger foil shell alone. The foil shell also inhibits direct conductive/convective losses to the room atmosphere, but it will not inhibit them to any greater extent. In fact, there are three reasons why it will inhibit these losses less than the glass container will. First, the material thermal conductivity of aluminum metal is far higher than that of glass, over 200 times greater (>200 W/(m*K) versus <1.0 W/(m*K)). Second, the foil, which is a small fraction of a millimeter thick, is far thinner than the glass container, which is about 4 mm thick on average. And third, the surface area of the foil is somewhat larger than the glass container, so it has more ability to conductively transfer heat to the outside air.

And yet, the surface of the bulb equilibrated at 148C under these conditions, over 40C hotter than with the glass container. With conductive/convective losses no less than with the glass container, and very probably greater, the only explanation for the higher temperature can be a difference in the radiative transfer. The glass container lets the large majority of the radiation from the bulb through, while the foil lets virtually none of it through, reflecting it back toward the bulb. The presence of the foil, which started at the room ambient of 23C and equilibrated at 46C, increased the temperature of the bulb, which started at 105C on the outside (and obviously warmer inside). The reflected radiation increased the temperature of the bulb, but did not produce “endless warming”; the temperature rose only until the losses, which increase with temperature, once again matched the input power of 35 watts.

Interestingly, the foil shell without the glass container inside led to a higher bulb temperature (148C) than the foil shell with the glass container inside (138C). Two layers of material around the bulb must reduce conductive/convective losses more than only one of them would, so the higher temperature must result from significantly more reflected radiation back to the bulb. With the glass inside, the reflected radiation must pass through two surfaces of the glass on the way back to the bulb, neither of which passes 100% through.

Another interesting comparison is the large foil shell that could fit outside of the glass container, about 160mm on a side, with the small foil shell that could fit inside the glass container, about 140mm on a side. With the large shell alone, the bulb temperature steadied at 148C; with the smaller shell, it steadied at 176C. With all direct radiative losses suppressed in both cases, the difference must come from the reduced surface area of the smaller shell, which lessens its conductive/convective transfer to the outside air at a given temperature difference. This is why halogen incandescent light bulbs, which are designed to run hotter than standard incandescent bulbs, are so much smaller for the same power level – they need to reduce conductive/convective losses to get the higher temperatures.
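The surface-area argument can be sketched with a simple Newton-cooling model, P = h·A·ΔT, where the convection coefficient h is an assumed round number rather than a measured one:

```python
# Sketch: with radiation fully trapped by the foil, the shell must
# shed the full ~35 W conductively/convectively, so a smaller shell
# area forces a larger temperature rise.
P = 35.0        # watts to dissipate through the shell
H_CONV = 10.0   # assumed free-convection coefficient, W/(m^2 K)
T_AMB = 23.0    # room temperature, C

def shell_temp(side_m):
    """Equilibrium shell temperature for a cube of the given side length."""
    area = 6.0 * side_m**2   # six faces of the cube
    return T_AMB + P / (H_CONV * area)

t_large = shell_temp(0.16)   # ~160 mm outer shell
t_small = shell_temp(0.14)   # ~140 mm inner shell

print(f"Large shell: {t_large:.0f} C, small shell: {t_small:.0f} C")
```

The computed values are not calibrated, but they land close to the measured shell temperatures of 46C and 50C, which supports the surface-area explanation.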

All of the above-discussed setups used directly reflected radiation from the aluminum foil. What happens when there is a barrier that absorbs this “shortwave” radiation and re-emits it as “longwave” radiation in the far infrared? Can this lead to higher temperatures of the warmer body? I could test this using black-anodized aluminum plate. Black anodizing a metal surface makes it very close to the perfect “blackbody” in the visible, near-infrared, and far-infrared ranges, with absorptivity/emissivity (which are the same at any given wavelength) around 97-98% in all of these ranges.

With a black plate shell of the same size as the smaller foil shell, the bulb surface temperature equilibrated at 129C, 24C hotter than with the glass container alone. Once again, the thin metal shell would inhibit conductive/convective losses no better, and likely worse than the glass container (because of higher material conductivity and lower thickness), so the difference must be from the radiative exchange. The presence of the shell, which started at the room ambient of 23C and increased to 47C, caused the bulb surface temperature to increase from 105C to 129C.
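The effect of the warmer black surroundings can be put in rough numbers with the Stefan-Boltzmann net-exchange formula for a small body in a black enclosure; the bulb area and emissivity here are assumed round numbers, not measurements:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_loss(t_bulb_c, t_surr_c, area=0.008, emissivity=0.9):
    """Net radiative power from the bulb to black surroundings, W."""
    tb = t_bulb_c + 273.15
    ts = t_surr_c + 273.15
    return emissivity * SIGMA * area * (tb**4 - ts**4)

q_to_room = net_loss(105.0, 23.0)   # surroundings at room temperature
q_to_shell = net_loss(105.0, 47.0)  # surroundings at shell temperature

print(f"Net loss to 23 C surroundings: {q_to_room:.1f} W")
print(f"Net loss to 47 C shell:        {q_to_shell:.1f} W")
```

Surrounding the 105C bulb with a 47C black shell instead of the 23C room reduces its net radiative loss at that temperature, so the bulb must warm until total losses again match the 35-watt input.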

Another interesting comparison is that of the smaller foil shell, which led to a bulb surface temperature of 176C and a shell temperature of 50C, to the black plate shell of the same size, which led to a bulb surface temperature of 129C and a shell temperature of 46C. While both of these create significantly higher bulb temperatures than the glass container, the reflective foil leads to a bulb surface temperature almost 50C higher than the black plate does. Why is this?

Consider the outside surface of the shell. The foil, which is an almost perfect reflector, has virtually zero radiative absorptivity, and therefore virtually zero radiative emissivity. So it can only transfer heat to the external room by conduction to the air, and subsequent convection away. The black plate, on the other hand, is virtually the perfect absorber and therefore radiator, so it can dissipate a lot of power to the room radiatively as well as conductively/convectively. Remember that, since it is radiating as a function of its own temperature, it will be radiating essentially equally from both sides, there being almost no temperature difference across the thickness of the plate. (Many faulty analyses miss this.) The foil simply reflects the bulb’s radiation back to the inside and radiates almost nothing to the outside. This is why the infrared thermometer does not read the temperature of the foil well.
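The difference in outer-surface radiation can be estimated with the Stefan-Boltzmann law; the shell area and the two emissivities are assumed typical values (bare aluminum foil is commonly quoted around 0.03-0.05, black anodize around 0.95-0.98):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_radiation(t_surf_c, t_amb_c, area_m2, emissivity):
    """Net radiative loss from a shell's outer surface to the room, W."""
    ts = t_surf_c + 273.15
    ta = t_amb_c + 273.15
    return emissivity * SIGMA * area_m2 * (ts**4 - ta**4)

AREA = 0.118  # six faces of a ~140 mm cube, m^2

p_black = net_radiation(47.0, 23.0, AREA, 0.97)  # black-anodized shell
p_foil = net_radiation(50.0, 23.0, AREA, 0.05)   # bare aluminum foil

print(f"Black shell radiates ~{p_black:.0f} W; foil only ~{p_foil:.1f} W")
```

The black shell can shed a substantial fraction of the 35-watt input radiatively from its outer surface; the foil can shed almost none of it that way, which is why the foil case runs so much hotter.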

The electrical voltage and current measurements were made to confirm that the increased temperature did not come from a higher electrical power input. The current measurements shown above demonstrate that the current draw of the bulb was no higher when the bulb temperature was higher, and was in fact slightly lower. This is to be expected, since the resistivity of the tungsten in the filament, as with any metal, increases with temperature. If you measure the resistance of an incandescent bulb at room temperature, this resistance is less than 10% of the resistance at its operating temperature. In this case, the “cold” resistance of the bulb is about 30 ohms, and the operating resistance is about 415 ohms.

Let’s look at the dynamic case, starting with the thermal equilibrium under the glass container alone. 35 watts are coming into the bulb from the electrical system, and 35 watts are leaving the bulb through conductive losses to the air and radiative losses to the room through the glass. Now we replace the glass with one of the metal shells. Conductive losses are not decreased (and may well be increased). But now the bulb is receiving radiant power from the metal shell, whether reflected in one case, or absorbed and re-radiated back at longer wavelengths in the other. Now the power into the bulb exceeds the power out, so the temperature starts to increase. (If you want to think in terms of net radiative exchange between the bulb and the shell, this net radiative output from the bulb decreases, and you get the same power imbalance.)

As the temperature of the bulb increases, both the conductive losses to the air at the surface of the bulb increase (approximately proportional to the temperature increase) and the radiative losses increase as well (approximately proportional to the 4th power of the temperature increase). Eventually, these losses increase to where the losses once again match the input power, and a new, higher-temperature thermal equilibrium is reached.

I originally did these tests employing a cylindrical glass container 150mm in diameter and 150mm high with and without foil shells, and got comparable results. In the second round shown here, I changed to a cubic container, so I could also create a black-plate shell of the same shape.

It is certainly possible that improvements to these experiments could shift the results by 1 or 2C, but I don’t see any way that they could wipe out the gross effect of the warming from the “back radiation”, which is several tens of degrees C.

All of these results are completely in line with the principles taught in undergraduate engineering thermodynamics and heat transfer courses. The idea that you could inhibit net thermal losses from an object with an internal power source, whether by conductive, convective, or radiative means, without increasing the temperature of that object, would be considered ludicrous in any of these courses. As the engineers and physicists in my group came by the lab bench to see what I was up to, not a single one thought for a moment that this back radiation would not increase the temperature of the bulb.

Generations of engineers have been taught these principles of thermal analysis, and have gone on to design crucial devices and infrastructure using them. If you think all of this is fundamentally wrong, you should not be spending your time arguing on blogs; you should be out doing whatever it takes to shut down all of the erroneously designed, and therefore dangerous, industrial systems that use high temperatures.

Conclusions

This experiment permitted the examination of various radiative transfer setups while controlling for conductive/convective losses from the bulb. While conductive/convective losses were not eliminated, they were at least as great, and probably greater, in the cases where a metal shell replaced the glass shell over the bulb.

Yet the bulb surface temperature was significantly higher with each of the metal shells than with the glass shell. The only explanation can therefore be the radiative transfer from the shells back to the bulb. In every case, the shells were significantly cooler than the bulb throughout the entire experiment, in both the transient and equilibrium conditions.

We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for. This is true both for reflected radiation of the same wavelengths the warmer body emitted, and for absorbed and re-radiated emissions of longer wavelengths. The temperature effects are so large that they cannot be explained by minor setup effects.

Electrical measurements were made to confirm that there was not increased electrical power into the bulb when it was at higher temperatures. In fact, the electrical power input was slightly reduced at higher temperatures.

This experiment is therefore compatible with the standard radiative physics paradigm that warmer and cooler bodies can exchange radiative power (but the warmer body will always transfer more power to the cooler body). It is not compatible with the idea that cooler bodies cannot transfer any power by radiative means to warmer bodies and cause an increase in temperature of the warmer body.

=====================================

UPDATE: The Principia/Slayers group has posted a hilarious rebuttal here:

http://principia-scientific.org/supportnews/latest-news/210-why-did-anthony-watts-pull-a-bait-and-switch.html

Per my suggestion, they have also enabled comments. You can go discuss it all there. – Anthony

412 Comments
Leonard Weinstein
May 28, 2013 12:06 pm

wikeroy says:
May 28, 2013 at 11:43 am
I have communicated many times with the slayers. You are correct that some of them agree with the process described as “slowed the cooling” (or insulated it, thereby making the temperature go up). Radiation absorption by absorbing gases and radiation by radiating gases to the ground does exactly the same thing, and thus causes a warmer ground than otherwise, and this is called the atmospheric greenhouse effect. Many of the slayers deny this effect altogether. The issue of CO2 increases being significant is a separate issue, as negative feedback and natural variation probably overwhelm that small effect. This is exactly what Roy, Anthony, and I say, and is the point of the present writeup. However, be clear, it slows the cooling by absorbing back radiation, and thus decreasing net radiation heat transfer. Thus, not all slayers are wrong on all issues, but those that deny that back radiation from a cooler source to a warmer source can cause increased temperature in the presence of constant supplied energy are totally wrong. Slayers such as Joseph Postma and some others totally deny the possibility of absorbing back radiation from a cooler source.

higley7
May 28, 2013 12:07 pm

Unless the light intensity is also added to this consideration, the energy flux and temperature are not meaningful. Did light intensity, which represents energy outward alter with changes in current or temperature of the filament? This experiment is far from being as simple as it first appears.
If you think about the atmosphere and the “GHGs,” IR radiation from the surface heads upward, encounters gas molecules and most often is immediately re-emitted in its original direction. Some molecules absorb the energy, and while energized, they collide with another molecule and, instead of the energy being re-emitted as IR, it becomes heat energy. Statistically this latter event is rare and thus there is little warming of the gases. Of course, the IPCC brains took the thermodynamic constant for CO2 and multiplied it by 12 to increase the effect, but even then it was still tiny. So, they called upon water vapor to be enslaved by CO2 and multiply the effect of CO2 by another order of magnitude. Of course, they also entirely deny the existence of the water cycle which is a huge heat engine which carries energy to altitude, moving perhaps as much as 85% of the energy budget, with the remainder as radiation (this is why Trenberth has so much trouble with “missing heat,” there being no water cycle in his world).
Now, any IR re-emitted by GHGs downward will encounter a warm surface whose equivalent energy levels are already filled, and thus, the IR will be absorbed as IR is emitted, being a wash, or simply reflected back upward. The result is no heating occurs. Furthermore, making this scenario even less likely to be at all detectable is the fact that the IPCC claims that this IR is redirected downward by the upper troposphere which is rather thin, at 0.25 atm, and has fairly little CO2 in terms of density per cubic meter. Sure, this is where Rayleigh scattering may occur and can be re-directed downward, but that is only one of 6 major directions and thus little IR is redirected downward. And, to make matters worse, the upper troposphere, which the IPCC says has to be warming faster than the surface, has been cooling a bit rather than warming and, in fact, the hotspot the IPCC MUST HAVE is entirely missing. This one missing feature of their model kills the whole model. It’s a must have!!!!!!!! Why do people feel free to ignore this glaring problem?

Anymoose
May 28, 2013 12:09 pm

I think what we have here is a confirmation that Ben Franklin knew what he was talking about, 250 years ago. Depending on radiation to heat air is a fool’s task.
Ben knew that standing in front of a fireplace would warm his backside to the point of discomfort, while leaving his belly cold. By placing the same fire (heat source) in a stove would heat the room to some level of comfort. Ben didn’t know about molecules and that sort of thing, but understood that the top, bottom and sides of the Franklin stove transferred heat to the air more efficiently. We understand now that those molecules bouncing off the stove, and off of each other, were being heated by contact and being carried off by convection, to eventually bring the room temperature to a more-or-less equilibrium temperature.
This also explains how the re-radiated heat energy from the earth’s surface can pass through the air molecules of the troposphere without raising their temperature much.
All of this coming from Ben Franklin’s highly calibrated backside.

Slartibartfast
May 28, 2013 12:10 pm

Not really an open system at all.

The only real open systems are hypothetical.

was what is being argued, that gasses cause a greenhouse effect

The argument here is not that gasses cause a greenhouse effect; the argument (as I understand) is that a cooler object can warm up a warm object. Which as I see it is intuitively obvious to the casual observer.

Carrick
May 28, 2013 12:12 pm

Jim, I do look at the line noise from AC line in. If you want to collect measurements with an accuracy of parts in 10^6 (I often do), then these things perhaps will matter. Even for high-end audio applications, filtering the AC matters (using balanced lines helps more though).
But for a physics demonstration where you’re just looking at qualitative changes, I don’t expect this to matter very much. Still replacing the AC line with a good DC digital power supply (where I can monitor the current draw directly) would be a nice improvement.

son of mulder
May 28, 2013 12:13 pm

” OldWeirdHarold says:
May 28, 2013 at 11:50 am
I actually did that futzing around when I was a kid. I painted a light bulb with dark red paint (don’t ask why). It got so hot it burned the bulb socket.”
I’m amazed it didn’t explode. I’ve had lightbulbs explode because they were too high a power for the light shade.

rgbatduke
May 28, 2013 12:13 pm

Go for it. Do it. And when the current drops (and it will, but possibly by a very small amount), tell me why that happened. Show all work.
Or, you could read TFA

Slartibartfast
May 28, 2013 12:15 pm

If you define line noise to include amplitude and phase variation on top of high-frequency noise, I’d say that yes, those will probably dominate here.
But if you take those away, you’re going to see approximately the same result.

Rob
May 28, 2013 12:16 pm

Seems all one would have to do is call a bulb manufacturer; their R&D should have all the data, as they manufacture a wide variety of colors and enclosures.

rgbatduke
May 28, 2013 12:16 pm

where it is shown that the current does indeed drop with the foil shell and warmer temperatures, proving that the filament does, in fact, heat as its resistance increases a small amount (given the warming in degrees absolute and T^4 radiative cooling).
Sorry about the split post — mouse clicked in the middle of typing…

May 28, 2013 12:16 pm

chris y says May 28, 2013 at 11:29 am
I continue to be amazed that this remains a controversial issue.
GE R&D center developed an incandescent bulb back in the late 1980′s (and GE lighting commercialized it in the 1990′s) that placed a spherical clear glass shell around the filament.
The shell was coated with a multilayer (anywhere from 15 – 30 separate layers) optical filter that reflects mid infrared back onto the filament, while allowing visible light to pass through.
The result is that a lower filament current can achieve the same filament temperature, thanks to the mid infrared energy being reflected back to the filament. This results in a 15% – 20% increase in lumens/Watt.

Excellent post. Bears repeating. Thank you.
It appears GE suspended some of that work, according to press at the time (2010):
GE Suspends Development Of High-Efficiency Incandescent Bulbs
http://www.environmentalleader.com/2008/12/01/ge-suspends-development-of-high-efficiency-incandescent-bulbs/
A little more detail on the technology:

There were two publicly-known technologies they were working on at the time that, if improved, could raise the efficacy of incandescent lamps to the 50 to 60 lm/W range.
The first is IR reflecting films, a technology that is already in commercial use. Considering that 90% to 95% of the energy generated by an incandescent filament is radiated away as IR (depending upon where you define the long wavelength end of the visible spectrum), using IR films to raise the efficacy of incandescent lamps by a factor of 3 or even 4 is possible. Low-voltage IR-halogen filament tubes may already meet the initial goal of 30 lm/W.

From: http://sci.engr.lighting.narkive.com/Zjpr8gDa/new-ge-incandescent-lamp-technology
.

FerdiEgb
May 28, 2013 12:20 pm

Duster says:
May 28, 2013 at 11:13 am
Might consider redoing this using a clear bulb. That would eliminate the effects of the interior coating of the bulb. Otherwise, very interesting indeed.
I think the bulb used was an internally etched one, not coated. That only spreads the light in all directions, but doesn’t absorb it.

higley7
May 28, 2013 12:21 pm

Another point of view: There is no doubt that the Earth’s SURFACE is cooler for there being an atmosphere, as it blocks some incoming radiation and also carries heat away by conduction and convection. BUT, and a big BUT, having an atmosphere makes the space ABOVE the surface much warmer, as the vacuum of space would be in the single digits of Kelvin. The ability of GHGs to convert IR to heat in the atmosphere is pathetic and undetectable in the real world.

Cho_cacao
May 28, 2013 12:23 pm

DirkH,
the discussion was about the heating bulb. Believe or not, but the planet is a slightly more complex system! 😉

rgbatduke
May 28, 2013 12:23 pm

And incidentally — well done! Too bad Joe won’t just concede that all of the times he has asserted to me personally in both WUWT threads and email threads that reflection does not increase the temperature of a heated source (while refusing to actually do the experiment properly himself or acknowledge the technology that is based on the principle) he has been just plain wrong.
“Back to the drawing board” wrong. “Start over, this time using the laws of physics” wrong. “Hard to be wronger than that” wrong. “Watch the movie” wrong. “Read the hard data points above in an easily reproducible experiment that controls for convection and conduction quite nicely” wrong. Really, truly, wrong.
Come on Joe. Time to man up. Acknowledge that you were WRONG so that we can go on to the next thing you are wrong about, then the next thing, until there is nothing left of the PSI assertions but hot air and a bad smell.
Or, you can shorten the process considerably and abandon PSI altogether, right now. Because they/you are also wrong about the Lindzen model, wrong about the supposed violation of the second law, and, in fact, wrong when they assert that there is no effect on surface temperature from clearly measurable atmospheric back radiation. All of which are perfectly obvious and empirically proven, but it’s time to acknowledge this and move on.
rgb

Just an engineer
May 28, 2013 12:26 pm

pgosselin says:
May 28, 2013 at 11:31 am
And part of the Gulf of Mexico flows up to Missouri, right?
————————————————————————————-
Actually, the Gulf of Mexico IS the predominant source of Missouri rain.
So you are correct. This time.

Maxbert
May 28, 2013 12:30 pm

This experiment seems like a lot of trouble just to demonstrate that while you can slow the rate of net transfer, depending on the point of equilibrium, you cannot reverse it.

rgbatduke
May 28, 2013 12:31 pm

If you think about the atmosphere and the “GHGs,” IR radiation from the surface heads upward, encounters gas molecules and most often is immediately re-emitted in its original direction.
Actually, that turns out not to be the case. Not just “most often” but almost always, the radiation is re-emitted in a dipolar distribution that is azimuthally symmetric around the polarization axis. For unpolarized light, this axis is itself essentially randomly distributed in the plane perpendicular to the original direction, and hence reradiation is essentially spherically symmetric.
So LWIR heads up, encounters gas molecules, is absorbed (or not, depending on wavelength and chance) and if absorbed, is radiated back down at the surface just under 50% of the time.
The “just under” comes from transfer, stimulated emission while excited, and the slightly smaller solid angle of the sphere versus space at some height above the surface, usually negligible.
This is distinct from albedo. You can read Grant Petty’s book if you want to work through the details.
rgb
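[A quick numerical sketch of the point above, illustrative only: if re-emission is isotropic, about half of the absorbed LWIR heads back down. A Monte Carlo over random directions shows the downward fraction converging to ~50%.]

```python
import random

def downward_fraction(n=1_000_000, seed=42):
    """Sample isotropic re-emission directions and return the fraction
    headed back toward the surface (negative vertical component).
    For an isotropic distribution, cos(theta) is uniform on [-1, 1]."""
    rng = random.Random(seed)
    down = 0
    for _ in range(n):
        cos_theta = rng.uniform(-1.0, 1.0)
        if cos_theta < 0:
            down += 1
    return down / n

print(downward_fraction())  # ~0.5, i.e. roughly half re-radiated downward
```

(The small corrections rgb mentions – collisional transfer, stimulated emission, solid-angle geometry – would nudge this slightly below 50%, and are not modeled here.)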

rgbatduke
May 28, 2013 12:35 pm

A cooler object doesn’t radiate energy to a warmer one.
I’ve got a blue-light laser here that says you are wrong. I will cheerfully direct it at your hand while holding a thermocouple against the laser. Any takers?
Any more obviously false idiocy?
rgb

Michael J. Dunn
May 28, 2013 12:36 pm

A horrendously long thread; I didn’t have time to read it all. And such confusion! Let me point out one simple matter: if either of two bodies is maintaining a surface temperature on the basis of an internal heat source being relieved by thermal radiation… there is no conservation of energy (duh! internal heat source).
As for two bodies next to one another, they are both “hot” relative to absolute zero (the only “cold” that counts). Either will raise the other’s temperature relative to what it might have been if the other had been somewhere else alone (no conservation of energy, remember?), and more heat will be transferred to the less “hot” body than to the more “hot” body (net heat transfer from “hot” to “cold”), as we all expect from simple thermodynamics.
It is easy to show that the Earth’s temperature is essentially a function of the ratio of its integrated radiation absorption coefficient to its integrated emission coefficient. Absolute surface temperature will vary in proportion to the fourth root of this ratio. This has meant the difference between tropical (carboniferous) and frozen ages. We know our albedo to maybe a 10% error. For a nominal surface temperature of 300 K, that works out to a temperature variation of 7 C or 13 F. It doesn’t take much to result in really big shifts. The miracle is not that the surface temperature changes, but that it really doesn’t change much at all. There are powerful negative feedbacks at work, thank God!
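[Checking the arithmetic in the comment above, as an illustrative sketch: if absolute temperature scales as the fourth root of the absorption/emission ratio, a 10% uncertainty in that ratio at a nominal 300 K works out to about 7 K, or roughly 13 Fahrenheit degrees, as stated.]

```python
def temp_from_ratio_change(T0, ratio_factor):
    """Surface temperature scales as the fourth root of the integrated
    absorption/emission ratio: T = T0 * ratio_factor**0.25."""
    return T0 * ratio_factor ** 0.25

T0 = 300.0                                    # nominal surface temperature, K
dT = temp_from_ratio_change(T0, 1.10) - T0    # 10% uncertainty in the ratio
print(f"{dT:.1f} K")                          # ~7.2 K
print(f"{dT * 9 / 5:.0f} F")                  # ~13 Fahrenheit degrees
```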

May 28, 2013 12:36 pm

q = ε σ (Th⁴ − Tc⁴) Ac
In view of the experiment, what do we do with this from the Engineering ToolBox? It clearly says that once two temperatures are equal, no heat transfer takes place. But the experiment and many here say that the above equation is wrong. Where is the feedback loop in the equation that allows for Tc to add to Th? rgb, according to this equation a reflection cannot heat the source, since Th = Tc in reflection, so where do we go? What equations or series of equations do we use?

May 28, 2013 12:41 pm

Carrick says May 28, 2013 at 12:12 pm

But for a physics demonstration where …

Ahem … I’d like to resolve the current draw issue (although I think it is now moot), I have a request standing for the model number of the clip-on style meter seen used in the above demo pics … I’m interested in those numbers and not so much in editorializing. Thank you.
I also addressed issues from an instrumentation and ‘dynamics’ standpoint for the more astute reader who has curiosity to that depth (I know I have).
BTW, accuracy in parts in 10^6 is really easy today, but if that is your ‘limit’ then you can improve on it by two or three orders of magnitude easily (for frequency anyway, which is what I am most often concerned with). I maintain a pair of IC-756ProII at around a part in 10^7 (short term), and this is by “zero beat” (using the human ear) against the NIST over-the-air SW reference station “WWV”.
Replacing the AC with DC allows low-cost instrumentation to be used and negates any ‘dynamics’ effects that can cloud (pardon the pun) or mask issues. This should remind many here who might have stumbled onto one or two (maybe more) YouTube videos put out by the ‘free energy’ crowd, who use ‘static’ instrumentation (which either displays RMS-corresponding values, normally valid only for AC sinusoids, or ignores peaks altogether, as most DMMs do) to observe obvious pulse phenomena as they attempt to ‘extract’ energy from magnets and pulsed coils/inductors, ignoring the ‘flyback’ effect as the built-up magnetic field collapses when the device’s ‘switch’ periodically removes the voltage or current source. The so-called “Rosemary Ainslie circuit” is an example with widespread coverage and publicity.

Barry Cullen
May 28, 2013 12:42 pm

Haven’t read all the comments, but consider this: one of the “bulb” manufacturers, don’t remember which one, had been producing tungsten-halogen bulbs for many years that have a coating of IR reflector (multi-layer interference type) on the clear, high-silica envelope. The W filament is supported on the center line of the bulb. The result is higher lumens per watt (~32–33) than without the coating (<30).
So, the cooler envelope coating IS heating the filament with IR that normally would escape. The increase in temperature is tiny (~53 K at 3000 K) for the 10% increased output, because the radiation increases as T^4, and that bulb envelope temperature is certainly way, way, way below 3000 K!
BTW – the T^4 relationship is likely why, in addition to or in conjunction with Willis' tropical thunderstorm governor mechanism, the temperature of the Earth is basically bi-modal, as determined by the long-term EM input: like Momma Bear's and Baby Bear's porridge, but never Papa Bear's.
BC
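[An illustrative check of how steep that T^4 dependence is: a ~53 K rise on a 3000 K filament is under 2% in temperature, yet it raises the total radiated power by roughly 7% per Stefan–Boltzmann; the visible (lumen) output climbs even more steeply, which is consistent with the ~10% lumen gain quoted above.]

```python
def radiated_power_ratio(T1, T2):
    """Ratio of total radiated power at temperature T2 vs T1,
    per the Stefan-Boltzmann T^4 law (emissivity assumed constant)."""
    return (T2 / T1) ** 4

boost = radiated_power_ratio(3000.0, 3053.0)   # a ~53 K rise at 3000 K
print(f"{(boost - 1) * 100:.1f}% more total radiated power")  # ~7.3%
```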

joeldshore
May 28, 2013 12:44 pm

Slartibartfast says:

Look, I am beginning to suspect that y’all are equating energy transfer with net energy transfer. Sure, the cooler object receives more energy from the warmer one than flows in the opposite direction. What of it? The warmer object is still getting more radiation than it would from e.g. intergalactic space.

Exactly. What people need to understand is that at the time the 2nd Law of Thermodynamics was formulated, the only thing that could really be considered was the net macroscopic energy flow. So, that is what “heat” is. We now understand this fundamentally time-irreversible behavior at the macroscopic level as arising from fundamentally time-reversible processes at the microscopic level, as described by statistical mechanics. (Think of “time reversibility” as referring to whether a movie of the situation makes sense when run in reverse. For example, the collision of two molecules would be time-reversible: a movie in reverse represents a perfectly reasonable collision. However, the spontaneous flow of heat from hot to cold, or the slowing down of a block sliding along a table due to friction, is not time-reversible: we would immediately recognize the movie in reverse as showing something that we do not observe in nature.)
What “Slayers” are doing is applying macroscopic laws to microscopic behavior. I would say that they were stuck in the middle of the 19th century in their thinking, but this is unfair to the scientists of that time period, because those scientists didn’t make this fundamental mistake. They merely described the only world that they had the capability to observe at that time, which was the macroscopic world.
The whole point of the modern understanding of thermodynamics is that the 2nd Law does not arise because of arbitrary and capricious laws like saying that colder objects don’t radiate toward warmer objects (or the warmer objects absorb the radiation from colder objects). Rather, it is a consequence of simple facts such as that the amount of radiation from a warmer object that is absorbed by a colder object is always greater than the amount of radiation from a colder object that is absorbed by a warmer object, a consequence guaranteed by the basic laws governing such radiation (i.e., that radiation emitted is an increasing function of temperature and Kirchhoff’s Radiation Law relating the emissivity and absorptivity of an object at a given wavelength).

May 28, 2013 12:45 pm

It clearly says that once two temperatures are equal no heat transfer takes place.

No, once two items are at the same temperature, the *net* heat transfer between them drops to 0. If they are not isolated systems, the equilibrium will be the result of all the heat flows between them canceling out to 0.
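[A minimal sketch of that distinction, using the engineering-toolbox equation quoted earlier: the formula gives the *net* exchange. Both bodies radiate gross flows at all times; the net is simply their difference, which is zero at equal temperatures and always runs hot to cold otherwise.]

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def net_flux(T_hot, T_cold, emissivity=1.0):
    """Net radiative flux (W/m^2) between two surfaces,
    q = eps * sigma * (Th^4 - Tc^4). Each surface emits its own
    gross flow eps*sigma*T^4; only the *net* must run hot -> cold."""
    return emissivity * SIGMA * (T_hot ** 4 - T_cold ** 4)

print(net_flux(330.0, 300.0))   # positive: net flow from hot to cold
print(net_flux(300.0, 300.0))   # 0.0: gross flows persist but cancel exactly
```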
