Light Bulb Back Radiation Experiment
Guest essay by Curt Wilson
In the climate blogosphere, there have been several recent posts on the basic principles of radiative physics and how they relate to heat transfer (see yesterday’s experiment by Anthony here). These have spawned incredibly lengthy argument threads in the comments between those who subscribe to the mainstream, textbook view of radiative heat transfer and those, notably the “Skydragon Slayers”, who reject this view.
A typical statement from a Slayer is that if “you have initially a body kept at a certain temperature by its internal source of energy”, and another body at a lower temperature is placed near it, then the radiation from the colder body could not increase the temperature of the warmer body, this being a violation of the 2nd Law of Thermodynamics. They continue that if this were possible, each object would increase the other’s temperature indefinitely, which would be an obvious violation of the 1st Law of Thermodynamics (energy conservation).
This is part of a more general claim by Slayers that radiation from a colder body cannot transfer any energy to a warm body and lead to a higher temperature of the warm body than would be the case without the presence of the colder body.
It occurred to me that these claims were amenable to simple laboratory experiments that I had the resources to perform. A light bulb is a classic example of a body with an internal source of energy. Several Slayers specifically used the example of reflection back to a light bulb as such an example.
In our laboratory, we often have to do thermal testing of our electronic products so we can ensure their reliability. Particularly when it comes to power electronics, we must consider the conductive, convective, and radiative heat transfer mechanisms by which heat can be removed from these bodies with an “internal source of energy”. We have invested in good thermocouple measurement devices, regularly calibrated by a professional service, to make the temperature measurements we need.
We often use banks of light bulbs as resistive loads in the testing of our power electronics, because it is a simple and inexpensive means to load the system and dissipate the power, and it is immediately obvious in at least a qualitative sense from looking at the bulbs whether they are dissipating power. So our lab bench already had these ready.
If you want to isolate the radiative effects, the ideal setup would be to perform experiments in a vacuum to eliminate the conductive/convective losses. However, the next best thing is to reduce and control these to keep them as much alike as possible in the different phases of the experiment.
So, on to the experiment. This first picture shows a standard 40-watt incandescent light bulb without power applied. The lead of the thermocouple measuring device is taped to the glass surface of the bulb with heat-resistant tape made for this purpose. The meter registers 23.2C. In addition, a professional-grade infrared thermometer is aimed at the bulb, showing a temperature of 72F. (I could not get it to change the units of the display to Celsius.) Note that throughout the experiment, the thermocouple measurements are the key ones.
Next, the standard North American voltage of 120 volts AC (measured as 120.2V) was applied to the bulb, which was standing in free air on a table top. The system was allowed to come to a new thermal equilibrium. At this new equilibrium, the thermocouple registered 93.5C. (The IR thermometer showed a somewhat lower 177F, but remember that its reported temperature makes assumptions about the emissivity of the object.)
Next, a clear cubic glass container about 150mm (6”) on a side, initially at the room temperature of 23C, was placed over the bulb, and once again the system was allowed to reach a new thermal equilibrium. In this state, the thermocouple on the bulb registered 105.5C, and the outer surface of the glass container registered 37.0C (equivalent to body temperature).
The glass container permits the large majority of the radiative energy to escape, both in the visible portion of the spectrum (obviously) and in the near infrared, as standard glass is highly transparent to wavelengths as long as 2500 nanometers (2.5 microns). However, it does inhibit the direct free convection losses, as air heated by the bulb can only rise as far as the top of the glass container. From there, it must conductively transfer to the glass, where it is conducted through the thickness of the glass, and the outside surface of the glass can transfer heat to the outside ambient atmosphere, where it can be convected away.
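The claim that glass passes the large majority of the bulb’s radiation can be checked against the Planck distribution. Here is a minimal numerical sketch, assuming a filament temperature of roughly 2700 K (typical for a 40 W incandescent bulb, but not measured in this experiment):

```python
import math

def planck(lam, T):
    """Blackbody spectral radiance per unit wavelength, W/(m^3 sr)."""
    h, c, k = 6.62607e-34, 2.99792e8, 1.38065e-23
    x = h * c / (lam * k * T)
    if x > 700:        # exp would overflow; the contribution is negligible anyway
        return 0.0
    return (2 * h * c**2 / lam**5) / (math.exp(x) - 1)

def fraction_below(lam_cut, T, n=20000):
    """Fraction of total blackbody emission at wavelengths below lam_cut."""
    # Midpoint-rule integral of the Planck curve from 0 to lam_cut,
    # divided by the Stefan-Boltzmann total radiance, sigma*T^4/pi.
    partial = sum(planck(lam_cut * (i + 0.5) / n, T) for i in range(n)) * (lam_cut / n)
    total = 5.670e-8 * T**4 / math.pi
    return partial / total

# Fraction of a ~2700 K filament's emission below the 2.5 micron glass cutoff
print(round(fraction_below(2.5e-6, 2700), 2))   # ~0.79
```

So roughly 80% of the filament’s output falls in the band where ordinary glass is highly transparent, consistent with the statement above. (The bulb’s own glass envelope re-emits some energy at longer wavelengths, so this is only a rough upper-bound sketch.)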
The next step in the experiment was to wrap an aluminum foil shell around the glass container. This shell would not permit any of the radiative energy from the bulb to pass through, and would reflect the large majority of that energy back to the inside. Once again the system was allowed to reach thermal equilibrium. In this new state, the thermocouple on the surface of the bulb registered 137.7C, and the thermocouple on the outer surface of the glass registered 69.6C. The infrared thermometer is not of much use here due to the very low emissivity (aka high reflectivity) of the foil. Interestingly, it did show higher temperatures when focused on the tape on the outside of the foil than on the foil itself.
Since adding the foil shell outside the glass container could be reducing the conductive/convective losses as well as the radiative losses, the shell was removed and the system with the glass container only was allowed to re-equilibrate at the conditions of the previous step. Then the glass container was quickly removed and the foil shell put in its place. After waiting for thermal equilibrium, the thermocouple on the surface of the bulb registered 148.2C and the thermocouple on the outside of the foil registered 46.5C. The transient response (not shown) was very interesting: the temperature increase of the bulb was much faster in this case than in the case of adding the foil shell to the outside of the glass container. Note also how low the infrared thermometer reads (84F = 29C) on the low-emissivity foil.
Further variations were then tried. A foil shell was placed inside the same glass container and the system allowed to reach equilibrium. The thermocouple on the surface of the bulb registered 177.3C, the thermocouple on the outer surface of the foil registered 67.6C, and the infrared thermometer, reading the outside of the glass (which has high emissivity at the wavelengths of ambient thermal radiation), read 105F (40.6C).
Then the glass container was removed from over the foil shell and the system permitted to reach equilibrium again. The thermocouple on the surface of the bulb registered 176.3C and the thermocouple on the outside of the foil registered 50.3C.
All of the above examples used the reflected shortwave radiation from the aluminum foil. What about absorbed and re-emitted longwave radiation? To test this, a shell of black-anodized aluminum plate, 1.5mm thick, was made, of the same size as the smaller foil shell. A black-anodized surface has almost unity absorption and emissivity, both in the shortwave (visible and near infrared) and longwave (far infrared). Placing this over the bulb (without the glass container), at equilibrium, the thermocouple on the bulb registered 129.1C and the thermocouple on the outside of the black shell registered 47.0C. The infrared thermometer read 122F (50C) on the tape on the outside of the shell.
The power source for this experiment was the electrical input. The wall voltage from the electrical grid was steady at 120.2 volts. The electrical current was measured under several conditions with a professional-grade clip-on current sensor. With the bulb in open air and a surface temperature of 96.0C, the bulb drew 289.4 milliamperes of current.
With the bulb covered by a foil shell alone and a surface temperature of 158.6C, the bulb drew slightly less, 288.7 milliamperes.
Summary of Results
The following table shows the temperatures at equilibrium for each of the test conditions:
| Condition | Bulb Surface Temperature | Shell Temperature |
|---|---|---|
| Bulb open to room ambient | 95C | — |
| Bulb covered by glass container alone | 105C | 37C |
| Bulb covered by glass container and outer reflective foil shell | 138C | 70C (glass) |
| Bulb covered by outer reflective foil shell alone | 148C | 46C (foil) |
| Bulb covered by inner reflective foil shell inside glass container | 177C | 68C (foil) |
| Bulb covered by inner reflective foil shell alone | 176C | 50C |
| Bulb covered by black-anodized aluminum shell alone | 129C | 47C |
Analysis
Having multiple configurations permits us to make interesting and informative comparisons. In all cases, there is about a 35-watt (120V x 0.289A) electrical input to the system, and thermal equilibrium is reached when the system is dissipating 35 watts to the room as well.
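As a quick sanity check on the stated input power, using the measured voltage and open-air current:

```python
# Electrical input power from the measured values
V = 120.2       # volts (measured wall voltage)
I = 0.2894      # amperes (bulb in open air)
print(round(V * I, 1))   # 34.8 W, i.e. about 35 W as used in the analysis
```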
I used a low-wattage (40W nominal) bulb because I had high confidence that it could take significant temperature increases without failure, as it has the same package design as much higher-wattage bulbs. Also, I would not be working with contraband high-wattage devices 😉
The case with the glass container alone is the important reference case. The glass lets virtually all of the radiant energy through, while inhibiting direct convection to the room ambient temperature of 23C. Conductive/convective losses must pass from the surface of the bulb, through the air under the container, to and through the glass, and then to the room atmosphere, where it is conducted/convected away. Under these conditions, the bulb surface temperature is 105C, which is 10C greater than when the bulb can conductively dissipate heat directly to the room atmosphere.
Compare this case to the case of the larger foil shell alone. The foil shell also inhibits direct conductive/convective losses to the room atmosphere, but it will not inhibit them to any greater extent. In fact, there are three reasons why it will inhibit these losses less than the glass container will. First, the material thermal conductivity of aluminum metal is far higher than that of glass, over 200 times greater (>200 W/(m*K) versus <1.0 W/(m*K)). Second, the foil, which is a small fraction of a millimeter thick, is far thinner than the glass container, which is about 4 mm thick on average. And third, the surface area of the foil is somewhat larger than the glass container, so it has more ability to conductively transfer heat to the outside air.
And yet, the surface of the bulb equilibrated at 148C under these conditions, over 40C hotter than with the glass container. With conductive/convective losses no less than with the glass container, and very probably greater, the only explanation for the higher temperature is a difference in the radiative transfer. The glass container lets the large majority of the radiation from the bulb through; the foil lets virtually none of it through, reflecting it back toward the bulb. The presence of the foil, which started at the room ambient of 23C and equilibrated at 46C, increased the temperature of the bulb, which started at 105C on the outside (and was obviously warmer inside). The reflected radiation increased the temperature of the bulb, but did not produce “endless warming”: the temperature rose only until the losses, which increase with temperature, once again matched the input power of 35 watts.
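The “net radiative exchange” view can be illustrated with the Stefan-Boltzmann law for a small gray body inside a large enclosure. The emissivity and surface area below are assumed round numbers, not measurements, and the reflective foil is treated only loosely as a warmed enclosure; the point is simply that warmer surroundings reduce the bulb’s net radiative loss, forcing its temperature up until the 35 W balance is restored:

```python
# Net radiative loss from a small gray body inside a large enclosure:
#   P_net = eps * sigma * A * (T_body^4 - T_surr^4)
sigma = 5.670e-8      # W/(m^2 K^4), Stefan-Boltzmann constant
eps = 0.9             # assumed emissivity of the glass bulb envelope
A = 0.008             # m^2, rough surface area of a 40 W bulb (assumed)

def net_loss(T_body_C, T_surr_C):
    Tb, Ts = T_body_C + 273.15, T_surr_C + 273.15
    return eps * sigma * A * (Tb**4 - Ts**4)

# Same 105C bulb surface, surroundings at room (23C) vs at the 46C shell:
print(round(net_loss(105, 23), 1))   # larger net loss to cool surroundings
print(round(net_loss(105, 46), 1))   # smaller net loss -> bulb must warm
```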
Interestingly, the foil shell without the glass container inside led to a higher bulb temperature (148C) than the foil shell with the glass container inside (138C). Two layers of material around the bulb must reduce conductive/convective losses more than only one of them would, so the higher temperature must result from significantly more reflected radiation back to the bulb. With the glass inside, the reflected radiation must pass through two surfaces of the glass on the way back to the bulb, neither of which passes 100% through.
Another interesting comparison is the large foil shell that could fit outside of the glass container, about 160mm on a side, with the small foil shell that could fit inside the glass container, about 140mm on a side. With the large shell alone, the bulb temperature steadied at 148C; with the smaller shell, it steadied at 176C. With all direct radiative losses suppressed in both cases, the difference must come from the reduced surface area of the smaller shell, which lessens its conductive/convective transfer to the outside air at a given temperature difference. This is why halogen incandescent light bulbs, which are designed to run hotter than standard incandescent bulbs, are so much smaller for the same power level – they need to reduce conductive/convective losses to get the higher temperatures.
All of the above-discussed setups used directly reflected radiation from the aluminum foil. What happens when there is a barrier that absorbs this “shortwave” radiation and re-emits it as “longwave” radiation in the far infrared? Can this lead to higher temperatures of the warmer body? I could test this using black-anodized aluminum plate. Black anodizing a metal surface makes it very close to the perfect “blackbody” in the visible, near-infrared, and far-infrared ranges, with absorptivity/emissivity (which are the same at any given wavelength) around 97-98% in all of these ranges.
With a black plate shell of the same size as the smaller foil shell, the bulb surface temperature equilibrated at 129C, 24C hotter than with the glass container alone. Once again, the thin metal shell would inhibit conductive/convective losses no better, and likely worse than the glass container (because of higher material conductivity and lower thickness), so the difference must be from the radiative exchange. The presence of the shell, which started at the room ambient of 23C and increased to 47C, caused the bulb surface temperature to increase from 105C to 129C.
Another interesting comparison is that of the smaller foil shell, which led to a bulb surface temperature of 176C and a shell temperature of 50C, to the black plate shell of the same size, which led to a bulb surface temperature of 129C and a shell temperature of 47C. While both create significantly higher bulb temperatures than the glass container does, the reflective foil leads to a bulb surface temperature almost 50C higher than the black plate. Why is this?
Consider the outside surface of the shell. The foil, which is an almost perfect reflector, has virtually zero radiative absorptivity, and therefore virtually zero radiative emissivity. So it can only transfer heat to the external room by conduction to the air, and subsequent convection away. The black plate, on the other hand, is virtually the perfect absorber and therefore radiator, so it can dissipate a lot of power to the room radiatively as well as conductively/convectively. Remember that, since it is radiating as a function of its own temperature, it will be radiating essentially equally from both sides, there being almost no temperature difference across the thickness of the plate. (Many faulty analyses miss this.) The foil simply reflects the bulb’s radiation back to the inside and radiates almost nothing to the outside. This is why the infrared thermometer does not read the temperature of the foil well.
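The difference between the two outer surfaces can be put in rough numbers with the Stefan-Boltzmann law. The emissivities below are typical handbook figures (bright aluminum foil around 0.04, black anodized aluminum around 0.97), not values measured in this experiment:

```python
# Radiant flux leaving the outer shell surface, relative to the 23C room:
#   q = eps * sigma * (T_surface^4 - T_room^4)   [W/m^2]
sigma = 5.670e-8
T_room = 23 + 273.15

def emitted(eps, T_surface_C):
    T = T_surface_C + 273.15
    return eps * sigma * (T**4 - T_room**4)

print(round(emitted(0.04, 50), 1))   # bright foil at 50C: under 10 W/m^2
print(round(emitted(0.97, 47), 1))   # black plate at 47C: ~150 W/m^2
```

Even though the black plate ran slightly cooler, its outer surface radiates roughly twenty times more per unit area, which is consistent with the black-shell configuration holding the bulb much cooler than the foil shell did.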
The electrical voltage and current measurements were made to confirm that the increased temperature did not come from a higher electrical power input. The current measurements shown above demonstrate that the current draw of the bulb was no higher when the bulb temperature was higher, and was in fact slightly lower. This is to be expected, since the resistivity of the tungsten in the filament, as with any metal, increases with temperature. If you measure the resistance of an incandescent bulb at room temperature, this resistance is less than 10% of the resistance at its operating temperature. In this case, the “cold” resistance of the bulb is about 30 ohms, and the operating resistance is about 415 ohms.
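The hot-filament resistance quoted above follows directly from Ohm’s law applied to the measured values (a quick check using the foil-shell current reading):

```python
# Hot vs cold resistance of the tungsten filament
V = 120.2            # volts, measured wall voltage
I_hot = 0.2887       # amperes, foil-shell case
R_hot = V / I_hot
R_cold = 30.0        # ohms, stated room-temperature measurement
print(round(R_hot))              # ~416 ohms, close to the ~415 quoted
print(round(R_cold / R_hot, 3))  # cold resistance is ~7% of hot
```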
Let’s look at the dynamic case, starting with the thermal equilibrium under the glass container alone. 35 watts are coming into the bulb from the electrical system, and 35 watts are leaving the bulb through conductive losses to the air and radiative losses to the room through the glass. Now we replace the glass with one of the metal shells. Conductive losses are not decreased (and may well be increased). But now the bulb is receiving radiant power from the metal shell, whether reflected in one case, or absorbed and re-radiated back at longer wavelengths in the other. Now the power into the bulb exceeds the power out, so the temperature starts to increase. (If you want to think in terms of net radiative exchange between the bulb and the shell, this net radiative output from the bulb decreases, and you get the same power imbalance.)
As the temperature of the bulb increases, both the conductive losses to the air at the surface of the bulb increase (approximately proportional to the temperature increase) and the radiative losses increase as well (approximately proportional to the 4th power of the temperature increase). Eventually, these losses increase to where the losses once again match the input power, and a new, higher-temperature thermal equilibrium is reached.
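This feedback to a new equilibrium can be sketched with a toy loss model: a linear conductive/convective term plus a T⁴ radiative term, with coefficients chosen (not measured) so that the glass-container case lands near the observed 105C. Reducing the effective radiative emissivity, as the foil does, then shifts the balance point upward:

```python
# Toy model: losses(T) = k_conv*(T - T_amb) + eps_eff*sigma*A*(T^4 - T_amb^4)
# The coefficients below are illustrative fits, not measurements.
sigma = 5.670e-8
T_amb = 296.15              # K, the 23C room
A = 0.008                   # m^2, assumed bulb surface area

def losses(T, k_conv, eps_eff):
    return k_conv * (T - T_amb) + eps_eff * sigma * A * (T**4 - T_amb**4)

def equilibrium(P_in, k_conv, eps_eff):
    # losses(T) increases monotonically with T, so bisection finds the
    # temperature at which the losses balance the electrical input.
    lo, hi = T_amb, 1000.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if losses(mid, k_conv, eps_eff) < P_in:
            lo = mid
        else:
            hi = mid
    return lo

T_glass = equilibrium(35, k_conv=0.364, eps_eff=0.9)  # radiation escapes
T_foil  = equilibrium(35, k_conv=0.364, eps_eff=0.2)  # most radiation returned
print(round(T_glass - 273.15), round(T_foil - 273.15))  # foil case is hotter
```

With these illustrative coefficients the glass case equilibrates near 105C and the reduced-emissivity case markedly higher, mirroring the qualitative behavior measured above.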
I originally did these tests employing a cylindrical glass container 150mm in diameter and 150mm high with and without foil shells, and got comparable results. In the second round shown here, I changed to a cubic container, so I could also create a black-plate shell of the same shape.
It is certainly possible that refinements to these experiments could shift the results by a degree or two, but I don’t see any way that they could wipe out the gross warming effect of the “back radiation”, which amounts to several tens of degrees C.
All of these results are completely in line with the principles taught in undergraduate engineering thermodynamics and heat transfer courses. The idea that you could inhibit net thermal losses from an object with an internal power source, whether by conductive, convective, or radiative means, without increasing the temperature of that object, would be considered ludicrous in any of these courses. As the engineers and physicists in my group came by the lab bench to see what I was up to, not a single one thought for a moment that this back radiation would not increase the temperature of the bulb.
Generations of engineers have been taught these principles of thermal analysis and have gone on to design crucial devices and infrastructure using them. If you think all of this is fundamentally wrong, you should not be spending your time arguing on blogs; you should be out doing whatever it takes to shut down all of the erroneously designed, and therefore dangerous, industrial systems that use high temperatures.
Conclusions
This experiment permitted the examination of various radiative transfer setups while controlling for conductive/convective losses from the bulb. While conductive/convective losses were not eliminated, they were at least as great, and probably greater, in the cases where a metal shell replaced the glass shell over the bulb.
Yet the bulb surface temperature was significantly higher with each of the metal shells than with the glass shell. The only explanation can therefore be the radiative transfer from the shells back to the bulb. In both cases, the shells were significantly cooler than the bulb throughout the entire experiment, both in the transient and equilibrium conditions.
We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for. This is true both for reflected radiation of the same wavelengths the warmer body emitted, and for absorbed and re-radiated emissions of longer wavelengths. The temperature effects are so large that they cannot be explained by minor setup effects.
Electrical measurements were made to confirm that there was not increased electrical power into the bulb when it was at higher temperatures. In fact, the electrical power input was slightly reduced at higher temperatures.
This experiment is therefore compatible with the standard radiative physics paradigm that warmer and cooler bodies can exchange radiative power (but the warmer body will always transfer more power to the cooler body). It is not compatible with the idea that cooler bodies cannot transfer any power by radiative means to warmer bodies and cause an increase in temperature of the warmer body.
=====================================
UPDATE: The Principia/Slayers group has posted a hilarious rebuttal here:
http://principia-scientific.org/supportnews/latest-news/210-why-did-anthony-watts-pull-a-bait-and-switch.html
Per my suggestion, they have also enabled comments. You can go discuss it all there. – Anthony
In radiative terms, surely the smaller inner foil on its own, and the larger outer foil on its own, are very similar. Yet in the table there is a difference of 28C.
Bulb covered by inner reflective foil shell alone: 176C
Bulb covered by outer reflective foil shell alone: 148C
If these temperature differences are due to radiative effects, not convective/conductive ones, how is this difference to be explained? Obviously the smaller size of the inner shell has a significant impact on the temperature, probably because it restricts convection more due to the smaller volume.
The interaction of photons with matter.
Seems relevant.
Paul Clark, you say, “If these temperature differences are due to radiative effects, not convective/conductive ones, how is this difference to be explained? Obviously the smaller size of the inner shell has a significant impact on the temperature, probably because it restricts convection more due to the smaller volume.”
I agree with you on this one. In my analysis, I stated, “With all direct radiative losses suppressed in both cases, the difference must come from the reduced surface area of the smaller shell, which lessens its conductive/convective transfer to the outside air at a given temperature difference. ”
You ask a very good question about how many layers of foil there were. Over most of the surface, just one, but there were a few areas where there were two or even three – I did not spend much time on the construction. If I get the chance, I will build one more carefully with minimal overlap. Thanks for pointing that out.
[snip – this comment is too stupid to publish, about what you’d expect from “somebody” – mod]
Convection is not free in this experiment. Because of this the air between the bulb and the glass jar acts as an insulator.
Charles Gerard Nelson says:
May 28, 2013 at 11:55 pm
Convection is not free in this experiment. Because of this the air between the bulb and the glass jar acts as an insulator
Indeed, but in all cases the convection was blocked and the insulation was practically identical. So the different outcomes of the experiments are not caused by differences in convection…
A far simpler experiment showing essentially the same thing simply requires a 100mm square of aluminium foil. Pick a cold still night and go outside. Hold the foil in a vertical orientation close to one cheek (not so close as to restrict convection) with the shiny side toward your face. One cheek will feel warmer. IR emitted from your skin is being reflected back to your skin, slowing its cooling rate. Simple.
AGW however remains a physical impossibility. There is no net radiative GHE on earth.
The reason the radiative GHE hypothesis fails is not due to major problems with radiative physics, but rather with fluid dynamics and gas conduction.
There is one minor problem with the radiative physics. The interface between the surface of water and the atmosphere is a special case: incident LWIR has no significant effect on the cooling rate of liquid water that is free to evaporatively cool. Given that this affects only 71% of the earth’s surface, it is not what you might call a huge problem. It just means that surface Tav under a non-radiative atmosphere would be cooler, but not 33C cooler.
The critical error in the radiative GHE hypothesis is in fluid dynamics and the physics of gas conduction. Radiative gases are critical to convective circulation below the tropopause. Without these gases, gas conductively heated at the surface could still convect to altitude, however it would no longer be able to lose energy and descend. In such an atmosphere the lapse rate would disappear and the temperature profile would trend to isothermal. The temperature of such a non radiative atmosphere would be set conductively by surface Tmax, not surface Tav. A non radiative atmosphere would be far hotter than an atmosphere with radiative gases and convective circulation.
The net effect of radiative gases in our atmosphere is cooling. The cooling effect of changing CO2 concentrations from 300 to 400ppm would be immeasurably small and indistinguishable from 0C.
Joel Shore and Curt
Climate ‘Science’ publications are unique in the way that the word ‘warm’ or ‘to warm’ or ‘warmer’ is used.
It is routinely used to describe a situation where an object is actually cooling.
Can you find anywhere else in the rich resource of English literature where ‘warm’ is used in that way?
That’s a rhetorical question because you cannot!
Three objects or entities are at different temperatures
A is at temperature 350K
B is at temperature 300K
C is at temperature 250K
All agree that A will lose heat to B and C
However Climate ‘Science’ will explain that if C is replaced with B then B has ‘warmed’ A.
Is there a reason for torturing language in this way?
Roy Spencer admits in his blog that he does this to ‘wind up’ the ‘slayers’.
Now John O’ Sullivan is a real pain in the ###, but I suggest that reduced heat loss is not best described by ‘warming’ or ‘heating’ the true heat source.
Anthony –
Coming into this a bit late, so you may not read this, but I hope so.
“Compare this case to the case of the larger foil shell alone. The foil shell also inhibits direct conductive/convective losses to the room atmosphere, but it will not inhibit them to any greater extent. In fact, there are three reasons why it will inhibit these losses less than the glass container will. First, the material thermal conductivity of aluminum metal is far higher than that of glass, over 200 times greater (>200 W/(m*K) versus <1.0 W/(m*K)). Second, the foil, which is a small fraction of a millimeter thick, is far thinner than the glass container, which is about 4 mm thick on average. And third, the surface area of the foil is somewhat larger than the glass container, so it has more ability to conductively transfer heat to the outside air."
There is a fourth factor. That low emissivity of the foil that you mentioned earlier is important: the heat that isn't emitted is being contained! Your experiment is very similar to my own real-world experience working in R&D with heated molds for hot-runner co-injection blow molding. We had 425°F hot cores (the male part of the molds) that were highly polished, and attempts to measure their temperatures with IR sensors were unsuccessful, just as you experienced. Our cores, however, were much thicker, made of solid bars of a high-strength stainless steel alloy.
The really weird thing, for one’s brain, was this:
We could not feel any of the 425°F heat coming off the surface of the cores, even when we got well within a millimeter. It was even something we would show off to people. The temps were essentially shop-floor ambient, even that close to the surface. It was amazing to play with, to see how close we could get and not touch it – and no matter HOW close we got, we could feel no heat. But when we actually TOUCHED them, of course, we got burned and burned pretty badly. *** There was essentially no conductivity or convectivity.
Why? We were not sure, but we concluded that the highly polished surface of the cores was containing the heat, as if it were being reflected back inside. Why did we conclude that? Because other surfaces nearby with less polish emitted/radiated and convected just fine.
But the emissivity was not constrained to only radiating the heat. The heat transfer through radiation, through convection, and through conduction to the air literally was stopped, for all intents and purposes. Whatever heat was getting out was very, very little, below our ability to feel it no matter how close we got, as long as we didn’t touch it.
So, your statement, “The foil shell also inhibits direct conductive/convective losses to the room atmosphere, but it will not inhibit them to any greater extent,” is not correct, based on my own extensive experience with polished/shiny metal surfaces. In my experience that shiny metallic surface DOES inhibit conduction and convection, as well as radiation/emissivity.
Therefore the foil does contain more heat within the enclosure than the glass, because convection and conduction don’t happen (or is vastly lower). I don’t agree with your conclusion at all.
Try the experiment again and move a thermometer within close proximity to the foil and compare to doing it with the glass. You should see much less temperature variations (and cooler) with the foil than with the glass.
Steve Garcia
*** This was in the late 1980s, just before CFCs were banned, including freon. When we burned ourselves on the cores – because, dammit, we couldn’t even tell when we were getting too close, we couldn’t feel the heat of the cores – we had spray cans of freon that we would use. It was absolutely the best thing ever for treating burns. One spray and the pain would go away, plus the burns never blistered, though the outer skin separated from the lower dermis. No other treatment was ever necessary, other than a 1-2 second spray of freon. When they banned freon I nearly went ballistic. No one even considered this application of freon when they banned it. And to boot, I completely disagreed with the conclusion that CFCs were creating the ozone hole above the South Pole. The logic was flawed right from the start. There were a variety of reasons I thought that, but I won’t go into them now.
I have not read all the comments so apologies if I say something that has already been said, but I am tempted to agree with Michael Moon when he says ( May 28, 2013 at 8:52 am) ” The Temperature of the Filament is the only temperature that matters here. Did the Filament warm from back-radiation?”
As I see matters, the flaw in this experiment is that it is not really addressing the crux of Joseph’s claim.
What we have here is the behavior of a glass bulb which is initially at room temperature but which then heats up when the electric bulb is switched on. The glass bulb is being heated by photons which come from a source at about 4000K, so it is no surprise that it heats up to 368K. When convection is hindered, the heat loss from the glass bulb is restricted, so it then reaches a new equilibrium temperature of 378K.
Now, instead of bringing a second object, which has its own internal energy source and which is radiating photons at say 300K, into the proximity of the glass bulb, to see whether photons radiating from an object at about 300K can warm an object at 368K (alternatively 378K), the experimenter reflects back photons at about 4000K (i.e., the same temperature as the filament source). I would suggest that one should get an object which only radiates, hold it at say 1 degC above the ambient room temperature, and then place that object near the electric bulb to see what effect it has on the temperature of the glass bulb. This object can be placed say 9 inches away, then 8 inches, then 7 inches, and so on, until it is only 1 inch away.
Surely we all know that one can collect and focus high-energy photons from the sun with an appropriately curved mirror, and heat an object placed at its focal point. If we add another similar mirror, we get more power. We see this in some of the solar energy arrays which may have hundreds of mirrors, each collecting some of the power and directing it to a focal point, so that the energy delivered there increases with the number of mirrors collecting and focusing the power.
This experiment appears to be doing little more than that.
What we need to see is the effect of photons emanating from a completely different, cooler source if one is to refute the claim by Joseph. Of course, Joseph needs to explain what is wrong with the experiment, and it would be useful if he would conduct an experiment which establishes what he claims to be the case.
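The proposed rock test can be sized in advance. Below is a minimal sketch (the temperatures and the blackbody idealisation are my own assumptions, not measurements from the experiment) of how much a ~40 degC soapstone face could reduce the radiative loss from the ~95 degC glass envelope, ignoring view factors, i.e. treating the rock as filling the bulb's whole view on one side, which gives an upper bound.

```python
# Sketch: how much could a soapstone rock at ~40 degC reduce the
# radiative loss from a bulb envelope at ~95 degC?  Blackbody
# surfaces assumed, view factor of 1 on the facing side.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def emissive_power(T):
    """Blackbody emissive power at absolute temperature T (W/m^2)."""
    return SIGMA * T**4

T_bulb = 368.0   # glass envelope, ~95 degC
T_room = 293.0   # ambient walls, ~20 degC
T_rock = 313.0   # soapstone, ~40 degC

loss_vs_room = emissive_power(T_bulb) - emissive_power(T_room)
loss_vs_rock = emissive_power(T_bulb) - emissive_power(T_rock)
reduction_pct = 100 * (1 - loss_vs_rock / loss_vs_room)

print(f"net loss facing room: {loss_vs_room:.0f} W/m^2")
print(f"net loss facing rock: {loss_vs_rock:.0f} W/m^2")
print(f"reduction: {reduction_pct:.0f} %")
```

On these assumed numbers the rock cuts the radiative loss on the facing side by roughly a fifth, so the effect should be measurable but modest, which is consistent with the suggestion to bring the rock in by steps.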
Postma rejects the idea that putting a highly absorptive/emissive shell around a body with an internal power source could increase the temperature of that body.
Went to bed last night freezing, wrapped the duvet around me (the internal heat source); it didn’t warm me at all, so I had to put my under-sheet electric blanket on.
“Michael Tremblay says: May 28, 2013 at 10:44 am
Your conclusion is fundamentally flawed. The statement: “We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for.” is false. What you have is solid experimental evidence that reducing the amount of energy (heat) leaving a system while maintaining the amount of energy (heat) entering the system will result in a temperature rise in the system. The temperature rise is due to the increasing amount of heat in the system and this heat is coming from the electric current being supplied to the bulb, not from the back radiation.”
This seems one of the very few rational comments here: “The temperature of a radiative body is modified by its environment. This is exactly what is violently rejected by any warmist who will claim that the increase of temperature can be nothing but the result of an increase of the power received by the bulb. In the case of the experiment above, warmists pretend that the light bulb in the blackbody enclosure must receive an extra 20W of power from the black enclosure. And where did this power come from? From the bulb itself, they say.
In the real world, those 20W of extra “back-radiation” power are nowhere to be found.”
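Whether the extra 20W is really “nowhere to be found” can be checked with idealised bookkeeping. The sketch below is my own toy model, not the actual experiment: a blackbody bulb with a constant 40W internal source, inside a thin blackbody shell of the same (assumed) area, with radiation the only transfer path. In steady state the shell must pass the full 40W on to the room, which fixes its temperature, and the bulb must pass 40W net to the shell, which fixes its higher temperature.

```python
# Energy bookkeeping for a powered bulb inside a radiating shell
# (idealised: equal areas, blackbodies, radiation only).

SIGMA = 5.670e-8       # W/m^2/K^4
A = 0.01               # m^2, assumed bulb/shell surface area
P = 40.0               # W, electrical input
T_room = 293.0         # K

# Shell: must emit P to the room on top of what it receives back.
T_shell = (P / (SIGMA * A) + T_room**4) ** 0.25

# Bulb: must emit P net to the shell.
T_bulb = (P / (SIGMA * A) + T_shell**4) ** 0.25

back_radiation = SIGMA * A * T_shell**4                  # shell -> bulb, W
net_bulb_to_shell = SIGMA * A * (T_bulb**4 - T_shell**4)  # W

print(f"shell temperature: {T_shell:.0f} K")
print(f"bulb temperature:  {T_bulb:.0f} K")
print(f"back-radiation:    {back_radiation:.1f} W")
print(f"net bulb->shell:   {net_bulb_to_shell:.1f} W")
```

In this model the back-radiation is simply the shell's own thermal emission, sustained by the bulb's output; the net flow is the full input power from hot to cold at every surface, so energy is conserved and nothing is double-counted.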
As usual, this topic has generated far more heat than light (pun fully intended). As far as I can see, it will continue to do so forever until Slayer and Antislayer sit down together and design an experiment satisfactory to both sides – with complete agreement on what to measure, the correct interpretation of those measurements, and so on. Until then, my position remains that neither side has actually shown anything – certainly not in terms recognisable / acceptable to the other side – and that meanwhile perhaps it’s time I got round to re-reading my aging copy of “Engineering Thermodynamics, S.I. Units” by Rogers and Mayhew (Longmans, 2nd ed, 1969 reprint).
Up at Castle Howard they have a great heating system. Pipes run through the lake in the grounds back to the house. Even though the lake is much colder than the house, there is heat to be extracted; to do this you need a geothermal or ground-source heat pump.
If I ran just cold pipes from the lake through the house, I imagine they would take away the heat.
Ray Boorman says:
May 28, 2013 at 10:24 pm
Curt, might not a better experiment be to have 2 light bulbs, say 100W & 25W. Turn on the 100W & find its equilibrium. Then position the 25W nearby, turn it on & measure the changes in the 100W bulb. I would also suggest that you should position the bulbs at least a few feet away from any other surface. This method would eliminate the greenhouse effect you cause by enclosing the bulb as you did in your experiment, & also minimise any other sources of reflected energy.
[Reply: actually, Anthony tells me he has an experiment already in the works like that – mod]
///////////////////////////////////////////////
I don’t think that works. The key issue here is not the effect of back-radiation but rather the effect of radiation. Back means nothing more than the direction.
I consider that you require not a second light bulb (whose filament will be running far hotter than the glass bulb of the primary bulb and will therefore be supplying high energy photons to the glass of the primary bulb), but rather you need some form of rock, perhaps made of soapstone, or the like, which has high heat retention and is a good radiator. What we need to see is the effect of low energy radiation.
You keep your 40 watt light bulb as is. You put the rock (soapstone) in an oven at say 30 to 50degC and heat it until it is about 40degC, so that it is warmer than ambient room air temperature but not by much. This rock will then radiate energy (low-energy photons).
You switch on your 40 watt light bulb as before and let it reach its equilibrium temperature (presumably around 95degC, as before). You then introduce the rock/soapstone close (but not too close) to the bulb, so that the glass envelope receives not only the high-energy photons from its filament but additionally the low-energy photons from the rock/soapstone, and then see what effect this has on the temperature of the glass. One should also measure the surface temperature of the rock/soapstone.
The rock/soapstone will not lose its heat quickly due to its high heat capacity and because it is not that much hotter than ambient room temperature.
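How long the rock stays usefully warm can be estimated with a lumped-capacitance sketch. All the material values below are assumed round numbers for illustration, not measured properties of any particular block.

```python
import math

# Lumped-capacitance cooling of a ~1 kg soapstone block from
# 40 degC toward a 20 degC room:
#     T(t) = T_room + (T0 - T_room) * exp(-t / tau),  tau = m*c / (h*A)
# The slow cooling comes from the block's large heat capacity
# (nothing is changing phase, so "latent heat" is not involved).

m = 1.0        # kg, assumed block mass
c = 980.0      # J/(kg K), rough specific heat of soapstone
h = 10.0       # W/(m^2 K), assumed combined convective+radiative coefficient
A = 0.03       # m^2, assumed surface area

tau = m * c / (h * A)          # time constant, seconds

T_room, T0 = 20.0, 40.0
def T(t):
    return T_room + (T0 - T_room) * math.exp(-t / tau)

print(f"time constant: {tau/60:.0f} minutes")
print(f"temperature after 10 min: {T(600):.1f} degC")
```

On these assumptions the time constant is the better part of an hour, so the stepped-distance test described above should be comfortably feasible before the rock cools off.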
Now, if the lake were the same temperature as the house, you would still need a heat pump to extract heat from it and make the house hotter. Now put that lake in the sky, at the same temperature as down at the surface: you would still need a heat pump to make the surface hotter.
The net radiative exchange is q = ε·σ·A·(Th^4 – Tc^4).
I find it unbelievable that someone can claim that increasing the temperature of the cooler body cannot warm the warmer body by reducing the warmer body's radiative cooling rate. It's in plain sight. Those who claim that should go find a heat transfer textbook and solve all the examples in it. Then they can discuss the matter; before that they are just a distraction.
Of course, this doesn’t mean that the so-called GHGs warm the Earth’s surface. On the face of it, they cool the atmosphere and therefore the surface.
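The textbook result being described (fixed internal power, surroundings warmed but still cooler than the body, hotter steady state) takes only a few lines to check numerically. The power, emissivity, and area below are assumed values for illustration.

```python
# For a body with fixed internal power P radiating to surroundings at
# T_c, steady state requires  P = eps*sigma*A*(T_h**4 - T_c**4),  so
#     T_h = (P / (eps*sigma*A) + T_c**4) ** 0.25
# Warming the (still cooler) surroundings reduces the net radiative
# loss and the body settles at a higher temperature.

SIGMA = 5.670e-8              # W/m^2/K^4
P, eps, A = 40.0, 0.9, 0.01   # W, emissivity, m^2 (assumed)

def T_hot(T_cold):
    return (P / (eps * SIGMA * A) + T_cold**4) ** 0.25

for T_c in (280.0, 300.0, 320.0, 340.0):
    print(f"surroundings {T_c:.0f} K -> body {T_hot(T_c):.1f} K")
```

The body stays far hotter than its surroundings at every step, yet its equilibrium temperature rises monotonically as the surroundings warm; no heat ever flows net from cold to hot, so no law of thermodynamics is troubled.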
I would REALLY like to see what the profile from the side (perpendicular to the mirror placement) would be.
One side of the bulb should be warmer than the other side, shouldn’t it?
GHGs aside, all that has been shown with the experiment is that the gas in the chamber has increased in temperature. This hotter gas has heated the globe externally, as would be expected. There is no measurement of the heat source (the filament). I admire your efforts; unfortunately they were misdirected. You measured the wrong thing.
“I find it unbelievable that someone can claim that increasing the temperature of the cooler body cannot warm the warmer body, by reducing the radiative cooling rate of the warmer body”
Is this used in industry to extract more heat out of a cooler body? It sounds wonderfully efficient.
Or is it just called insulation?
Still the wrong conclusions. Yes, the bulb envelope increased its temperature, but did the element, i.e. the thing that produces the heat and light, increase in temperature? It would still be running at the same temperature; a current change of 1mA would not make a measurable difference and might be a system variation. Clip-on ammeters are not that accurate, since they work by induction and are normally used for industrial purposes, not to measure micro-currents in a laboratory (sensitive but not highly accurate).
richard verney says: May 29, 2013 at 2:55 am
see experiment here
http://www.climateandstuff.blogspot.co.uk/2013/05/back-radiation-early-results-no-fan.html
“Covering the bulb with foil has no effect on the filament temperature. The concept of heating by back-radiation is not supported”
This is one that can be tested at home so simply that it is almost unbelievable that the writer hasn’t tried it.
Get two similar bulbs, same power, same supplier, preferably a highish rating, which will make the test so much faster. If you can get 100W incandescents they’d be great.
Cover one tightly in a single layer of foil (small overlaps won’t matter). Make sure it’s a nice tight fit, so that the foil will be at almost the same temperature as the glass of the bulb.
Then turn on both lights, and wait.
See if you can guess which filament overheats and burns out in no time at all.
And why would that be, do you think? Both filaments will be drawing the same power and have the same ability to lose energy via convection and/or conduction; only radiation is being blocked.
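One way to see why the foil-wrapped bulb should run so much hotter is an envelope energy balance. The sketch below is a simplification with assumed values: all of the bulb's power is treated as deposited in the envelope, the emissivities and convective coefficient are round figures (bright aluminium foil has a far lower emissivity than glass), and the steady-state envelope temperature is found by bisection.

```python
# Steady-state envelope temperature: the envelope must shed the full
# bulb power through radiation plus convection,
#     P/A = eps*sigma*(T**4 - T_room**4) + h*(T - T_room)
# With foil (low eps) the radiative channel is throttled, so T must
# rise until convection makes up the difference.

SIGMA = 5.670e-8
P = 100.0        # W, bulb power
A = 0.012        # m^2, assumed envelope area
h = 12.0         # W/(m^2 K), assumed convective coefficient
T_room = 293.0   # K

def envelope_temp(eps):
    """Bisect for the temperature at which heat shed equals heat supplied."""
    def surplus(T):  # heat shed minus heat supplied, W
        return A * (eps * SIGMA * (T**4 - T_room**4) + h * (T - T_room)) - P
    lo, hi = T_room, 2000.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if surplus(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(f"glass envelope (eps ~ 0.90): {envelope_temp(0.90):.0f} K")
print(f"foil envelope  (eps ~ 0.05): {envelope_temp(0.05):.0f} K")
```

On these assumptions the foil-wrapped envelope settles hundreds of kelvin hotter than bare glass for the same input power, which is at least qualitatively consistent with the burned-out-bulb observation, though a real bulb also changes its fill-gas and filament behaviour as it overheats.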