Light Bulb Back Radiation Experiment
Guest essay by Curt Wilson
In the climate blogosphere, there have been several posts recently on the basic principles of radiative physics and how they relate to heat transfer. (see yesterday’s experiment by Anthony here) These have spawned incredibly lengthy streams of arguments in the comments between those who subscribe to the mainstream, or textbook, view of radiative heat transfer, and those, notably the “Skydragon Slayers,” who reject this view.
A typical statement from a Slayer is that if you have a body “kept at a certain temperature by its internal source of energy,” and another body at a lower temperature is placed near it, then the radiation from the colder body cannot increase the temperature of the warmer body, since that would be a violation of the 2nd Law of Thermodynamics. They continue that if this were possible, the two objects would go on increasing each other’s temperature indefinitely, an obvious violation of the 1st Law of Thermodynamics (energy conservation).
This is part of a more general claim by Slayers that radiation from a colder body cannot transfer any energy to a warmer body, and so cannot leave the warmer body at a higher temperature than it would reach without the colder body present.
It occurred to me that these claims were amenable to simple laboratory experiments that I had the resources to perform. A light bulb is a classic example of a body with an internal source of energy, and several Slayers specifically cited reflection of radiation back to a light bulb as just such a case.
In our laboratory, we often have to do thermal testing of our electronic products so we can ensure their reliability. Particularly when it comes to power electronics, we must consider the conductive, convective, and radiative heat transfer mechanisms by which heat can be removed from these bodies with an “internal source of energy”. We have invested in good thermocouple measurement devices, regularly calibrated by a professional service, to make the temperature measurements we need.
We often use banks of light bulbs as resistive loads in the testing of our power electronics, because it is a simple and inexpensive means to load the system and dissipate the power, and it is immediately obvious in at least a qualitative sense from looking at the bulbs whether they are dissipating power. So our lab bench already had these ready.
If you want to isolate the radiative effects, the ideal setup would be to perform experiments in a vacuum to eliminate the conductive/convective losses. However, the next best thing is to reduce and control these to keep them as much alike as possible in the different phases of the experiment.
So, on to the experiment. This first picture shows a standard 40-watt incandescent light bulb without power applied. The lead of the thermocouple measuring device is taped to the glass surface of the bulb with heat-resistant tape made for this purpose. The meter registers 23.2C. In addition, a professional-grade infrared thermometer is aimed at the bulb, showing a temperature of 72F. (I could not get it to change the units of the display to Celsius.) Note that throughout the experiment, the thermocouple measurements are the key ones.
Next, the standard North American voltage of 120 volts AC (measured as 120.2V) was applied to the bulb, which was standing in free air on a table top. The system was allowed to come to a new thermal equilibrium. At this new equilibrium, the thermocouple registered 93.5C. (The IR thermometer showed a somewhat lower 177F, but remember that its reported temperature makes assumptions about the emissivity of the object.)
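As an aside, a simple total-radiance model shows how sensitive an IR thermometer reading is to its emissivity setting. The sketch below (Python) is illustrative only: the emissivity values are assumptions, not measurements of this bulb, and it ignores the instrument's spectral band and spot-size effects.

```python
# Illustrative only: a simple total-radiance (T^4) model of an IR thermometer,
# ignoring its spectral band and spot-size effects. Emissivity values below
# are assumptions, not measurements of this bulb.
T_true = 93.5 + 273.15    # thermocouple reading on the glass, K
T_amb  = 23.2 + 273.15    # room temperature, K

def reported(eps_true, eps_set):
    """Temperature a T^4-model IR thermometer reports if the surface's true
    emissivity is eps_true but the instrument is set to eps_set."""
    signal = eps_true * (T_true**4 - T_amb**4)      # net signal above background
    return (T_amb**4 + signal / eps_set) ** 0.25

for eps_true in (0.95, 0.85, 0.75):
    t_c = reported(eps_true, eps_set=0.95) - 273.15
    print(f"true emissivity {eps_true:.2f}: reads ~{t_c:.0f} C ({t_c*9/5+32:.0f} F)")
```

Even a modest mismatch between the assumed and actual emissivity is enough, in this simplified model, to shift the reported temperature by the ten-plus degrees of discrepancy seen here.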
Next, a clear cubic glass container about 150mm (6”) on a side, initially at the room temperature of 23 C, was placed over the bulb, and once again the system was allowed to reach a new thermal equilibrium. In this state, the thermocouple on the temperature of the bulb registers 105.5C, and the outer surface of the glass container registers 37.0C (equivalent to body temperature).
The glass container permits the large majority of the radiative energy to escape, both in the visible portion of the spectrum (obviously) and in the near infrared, as standard glass is highly transparent to wavelengths as long as 2500 nanometers (2.5 microns). However, it does inhibit the direct free-convection losses, as air heated by the bulb can only rise as far as the top of the glass container. From there, the heat must transfer conductively to the glass, be conducted through the thickness of the glass, and then pass from the outer surface of the glass to the ambient air outside, where it is convected away.
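As a rough check on that "large majority" claim, one can integrate the Planck distribution up to 2.5 microns for a filament temperature in the 2600-2800 K range typical of a 40-watt incandescent (an assumption; the filament temperature was not measured here):

```python
import numpy as np

h, c, k, sigma = 6.626e-34, 2.998e8, 1.381e-23, 5.670e-8   # SI constants

def planck_exitance(lam, T):
    """Blackbody spectral exitance, W/m^2 per metre of wavelength."""
    return 2 * np.pi * h * c**2 / lam**5 / np.expm1(h * c / (lam * k * T))

def fraction_below(lam_cut, T, n=20000):
    """Fraction of total blackbody output emitted at wavelengths < lam_cut."""
    lam = np.linspace(50e-9, lam_cut, n)
    return np.trapz(planck_exitance(lam, T), lam) / (sigma * T**4)

for T in (2600.0, 2800.0):        # assumed filament temperatures, K
    print(f"T = {T:.0f} K: {fraction_below(2.5e-6, T):.0%} emitted below 2.5 microns")
```

Roughly 75-80% of the filament's radiant output falls below the 2.5-micron cutoff at these assumed temperatures, so most of it can pass straight through the glass.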
The next step in the experiment was to wrap an aluminum foil shell around the glass container. This shell would not permit any of the radiative energy from the bulb to pass through, and would reflect the large majority of that energy back to the inside. Once again the system was allowed to reach thermal equilibrium. In this new state, the thermocouple on the surface of the bulb registered 137.7C, and the thermocouple on the outer surface of the glass registered 69.6C. The infrared thermometer is not of much use here due to the very low emissivity (aka high reflectivity) of the foil. Interestingly, it did show higher temperatures when focused on the tape on the outside of the foil than on the foil itself.
Since adding the foil shell outside the glass container could be reducing the conductive/convective losses as well as the radiative losses, the shell was removed and the system with the glass container only was allowed to re-equilibrate at the conditions of the previous step. Then the glass container was quickly removed and the foil shell put in its place. After waiting for thermal equilibrium, the thermocouple on the surface of the bulb registered 148.2C and the thermocouple on the outside of the foil registered 46.5C. The transient response (not shown) was very interesting: the temperature increase of the bulb was much faster in this case than in the case of adding the foil shell to the outside of the glass container. Note also how low the infrared thermometer reads (84F = 29C) on the low-emissivity foil.
Further variations were then tried. A foil shell was placed inside the same glass container and the system allowed to reach equilibrium. The thermocouple on the surface of the bulb registered 177.3C, the thermocouple on the outer surface of the foil registered 67.6C, and the infrared thermometer reading the outside of the glass (which has high emissivity to the wavelengths of ambient thermal radiation) reads 105F (40.6C).
Then the glass container was removed from over the foil shell and the system permitted to reach equilibrium again. The thermocouple on the surface of the bulb registered 176.3C and the thermocouple on the outside of the foil registered 50.3C.
All of the above examples used the reflected shortwave radiation from the aluminum foil. What about absorbed and re-emitted longwave radiation? To test this, a shell of black-anodized aluminum plate, 1.5mm thick, was made, of the same size as the smaller foil shell. A black-anodized surface has almost unity absorptivity and emissivity, both in the shortwave (visible and near infrared) and longwave (far infrared). Placing this over the bulb (without the glass container), at equilibrium, the thermocouple on the bulb registered 129.1C and the thermocouple on the outside of the black shell registered 47.0C. The infrared thermometer read 122F (50C) on the tape on the outside of the shell.
The power source for this experiment was the electrical input. The wall voltage from the electrical grid was steady at 120.2 volts. The electrical current was measured under several conditions with a professional-grade clip-on current sensor. With the bulb in open air and a surface temperature of 96.0C, the bulb drew 289.4 milliamperes of current.
With the bulb covered by a foil shell alone and a surface temperature of 158.6C, the bulb drew slightly less, 288.7 milliamperes.
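For reference, the electrical power and hot filament resistance implied by these readings are simple to compute:

```python
V = 120.2                                   # measured line voltage, volts
readings = {"open air (96.0C bulb)": 0.2894,
            "foil shell (158.6C bulb)": 0.2887}   # measured current, amperes

for label, I in readings.items():
    print(f"{label}: P = {V*I:.1f} W, hot resistance = {V/I:.0f} ohm")
# open air (96.0C bulb): P = 34.8 W, hot resistance = 415 ohm
# foil shell (158.6C bulb): P = 34.7 W, hot resistance = 416 ohm
```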
Summary of Results
The following table shows the temperatures at equilibrium for each of the test conditions:
| Condition | Bulb Surface Temperature | Shell Temperature |
|---|---|---|
| Bulb open to room ambient | 95C | — |
| Bulb covered by glass container alone | 105C | 37C (glass) |
| Bulb covered by glass container and outer reflective foil shell | 138C | 70C (glass) |
| Bulb covered by outer reflective foil shell alone | 148C | 46C (foil) |
| Bulb covered by inner reflective foil shell inside glass container | 177C | 68C (foil) |
| Bulb covered by inner reflective foil shell alone | 176C | 50C (foil) |
| Bulb covered by black-anodized aluminum shell alone | 129C | 47C (shell) |
Analysis
Having multiple configurations permits us to make interesting and informative comparisons. In all cases, there is about a 35-watt (120V x 0.289A) electrical input to the system, and thermal equilibrium is reached when the system is dissipating 35 watts to the room as well.
I used a low-wattage (40W nominal) bulb because I had high confidence that it could take significant temperature increases without failure, as it has the same package design as much higher-wattage bulbs. Also, I would not be working with contraband high-wattage devices 😉
The case with the glass container alone is the important reference case. The glass lets virtually all of the radiant energy through, while inhibiting direct convection to the room air at its ambient temperature of 23C. Conductive/convective losses must pass from the surface of the bulb, through the air under the container, to and through the glass, and then to the room atmosphere, where they are conducted/convected away. Under these conditions, the bulb surface temperature is 105C, which is 10C greater than when the bulb can dissipate heat directly to the room air.
Compare this case to the case of the larger foil shell alone. The foil shell also inhibits direct conductive/convective losses to the room atmosphere, but it will not inhibit them to any greater extent. In fact, there are three reasons why it will inhibit these losses less than the glass container will. First, the material thermal conductivity of aluminum metal is far higher than that of glass, over 200 times greater (>200 W/(m*K) versus <1.0 W/(m*K)). Second, the foil, which is a small fraction of a millimeter thick, is far thinner than the glass container, which is about 4 mm thick on average. And third, the surface area of the foil is somewhat larger than the glass container, so it has more ability to conductively transfer heat to the outside air.
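A quick order-of-magnitude check bears this out. The sketch below compares the conductive resistance of each wall, per unit area, with a typical still-air film resistance; the foil thickness and the film coefficient are assumptions, while the glass thickness and conductivities are the values quoted above.

```python
# Conductive resistance of the wall itself, per unit area (thickness / k),
# next to a typical still-air film resistance (~1/h). Foil thickness and the
# film coefficient are assumptions; the other values are quoted above.
walls = {"glass, 4 mm":   (4e-3, 1.0),     # (thickness m, conductivity W/m/K)
         "foil, 0.02 mm": (2e-5, 200.0)}
h_air = 8.0                                 # assumed free-convection film, W/m^2/K

for name, (t, k) in walls.items():
    print(f"{name}: wall resistance = {t/k:.1e} K*m^2/W")
print(f"still-air film: ~{1/h_air:.1e} K*m^2/W (dominates either wall)")
```

Either wall's conduction is orders of magnitude smaller than the air films on its two sides, so the wall material itself is never the bottleneck; if anything, the foil shell's extra area helps it shed heat to the room.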
And yet, the surface of the bulb equilibrated at 148C under these conditions, over 40C hotter than with the glass container. With conductive/convective losses no less than with the glass container, and very probably greater, the only explanation for the higher temperature can be a difference in the radiative transfer. The glass container lets the large majority of the radiation from the bulb through, while the foil lets virtually none of it through, reflecting it back toward the bulb. The presence of the foil, which started at the room ambient of 23C and equilibrated at 46C, increased the temperature of the bulb, which started at 105C on the outside (and obviously warmer inside). The reflected radiation increased the temperature of the bulb, but it did not produce “endless warming”; the temperature simply rose until the losses, which increase with temperature, once again matched the input power of 35 watts.
Interestingly, the foil shell without the glass container inside led to a higher bulb temperature (148C) than the foil shell with the glass container inside (138C). Two layers of material around the bulb must reduce conductive/convective losses more than one layer would, so the higher temperature with the foil alone must result from significantly more radiation being reflected back to the bulb. With the glass inside, the reflected radiation must pass through two surfaces of the glass on the way back to the bulb, and neither surface passes 100% of it.
Another interesting comparison is between the large foil shell that fits outside the glass container, about 160mm on a side, and the small foil shell that fits inside it, about 140mm on a side. With the large shell alone, the bulb temperature steadied at 148C; with the smaller shell, it steadied at 176C. With all direct radiative losses suppressed in both cases, the difference must come from the reduced surface area of the smaller shell, which lessens its conductive/convective transfer to the outside air at a given temperature difference. This is why halogen incandescent light bulbs, which are designed to run hotter than standard incandescent bulbs, are so much smaller for the same power level – they need to reduce conductive/convective losses to get the higher temperatures.
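A crude convection-only estimate makes the size effect concrete. The film coefficient here is an assumption (around 10 W/m²K for free convection), not a measurement:

```python
h, P = 10.0, 35.0     # assumed film coefficient (W/m^2/K) and power to reject (W)

for name, side in (("large foil shell (160 mm)", 0.16),
                   ("small foil shell (140 mm)", 0.14)):
    A = 6 * side**2   # rough area of a closed cube, m^2
    print(f"{name}: shell must run ~{P/(h*A):.0f} C above room (convection only)")
```

These rough figures (about 23C and 30C above the 23C room) land in the neighborhood of the measured shell temperatures of 46.5C and 50.3C, consistent with the surface-area explanation.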
All of the above-discussed setups used directly reflected radiation from the aluminum foil. What happens when there is a barrier that absorbs this “shortwave” radiation and re-emits it as “longwave” radiation in the far infrared? Can this lead to higher temperatures of the warmer body? I could test this using black-anodized aluminum plate. Black anodizing a metal surface makes it very close to the perfect “blackbody” in the visible, near-infrared, and far-infrared ranges, with absorptivity/emissivity (which are the same at any given wavelength) around 97-98% in all of these ranges.
With a black plate shell of the same size as the smaller foil shell, the bulb surface temperature equilibrated at 129C, 24C hotter than with the glass container alone. Once again, the thin metal shell would inhibit conductive/convective losses no more, and likely less, than the glass container (because of its higher material conductivity and smaller thickness), so the difference must be from the radiative exchange. The presence of the shell, which started at the room ambient of 23C and increased to 47C, caused the bulb surface temperature to increase from 105C to 129C.
Another interesting comparison is between the smaller foil shell, which led to a bulb surface temperature of 176C and a shell temperature of 50C, and the black plate shell of the same size, which led to a bulb surface temperature of 129C and a shell temperature of 47C. While both of these produce significantly higher bulb temperatures than the glass container, the reflective foil leads to a bulb surface temperature almost 50C higher than the black plate does. Why is this?
Consider the outside surface of the shell. The foil, which is an almost perfect reflector, has virtually zero radiative absorptivity, and therefore virtually zero radiative emissivity. So it can only transfer heat to the external room by conduction to the air, and subsequent convection away. The black plate, on the other hand, is virtually the perfect absorber and therefore radiator, so it can dissipate a lot of power to the room radiatively as well as conductively/convectively. Remember that, since it is radiating as a function of its own temperature, it will be radiating essentially equally from both sides, there being almost no temperature difference across the thickness of the plate. (Many faulty analyses miss this.) The foil simply reflects the bulb’s radiation back to the inside and radiates almost nothing to the outside. This is why the infrared thermometer does not read the temperature of the foil well.
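The Stefan-Boltzmann law makes the difference concrete. Taking the measured shell temperatures, the emissivities discussed here (roughly 0.05 for foil, 0.97 for black anodize), and an assumed exposed area of about 0.1 m² for the ~140 mm shell:

```python
sigma = 5.67e-8                        # Stefan-Boltzmann constant, W/(m^2 K^4)
T_amb = 23.0 + 273.15                  # room temperature, K
A     = 0.10                           # assumed exposed area of ~140 mm shell, m^2

shells = {"foil shell (eps ~0.05, 50 C)":        (0.05, 50.0 + 273.15),
          "black plate shell (eps ~0.97, 47 C)": (0.97, 47.0 + 273.15)}

for name, (eps, T) in shells.items():
    P_rad = eps * sigma * A * (T**4 - T_amb**4)
    print(f"{name}: radiates roughly {P_rad:.1f} W to the room")
```

By this rough estimate the black shell dumps on the order of 15 of the 35 watts to the room radiatively, while the foil shell radiates less than 1 watt, which is why everything inside the foil shell must run hotter before the losses balance.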
The electrical voltage and current measurements were made to confirm that the increased temperature did not come from a higher electrical power input. The current measurements shown above demonstrate that the current draw of the bulb was no higher when the bulb temperature was higher, and was in fact slightly lower. This is to be expected, since the resistivity of the tungsten in the filament, as with any metal, increases with temperature. If you measure the resistance of an incandescent bulb at room temperature, this resistance is less than 10% of the resistance at its operating temperature. In this case, the “cold” resistance of the bulb is about 30 ohms, and the operating resistance is about 415 ohms.
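As a side note, the hot-to-cold resistance ratio even gives a rough filament temperature, using the rule of thumb that tungsten's resistance grows roughly as T^1.2 over this range (an approximation, not a measured property of this particular bulb):

```python
R_cold, R_hot, T_room = 30.0, 415.0, 293.0   # ohms, ohms, kelvin

# Rule of thumb: tungsten resistance rises roughly as T^1.2 over this range
# (an approximation, not an exact material law). Invert it to estimate the
# filament's operating temperature from the resistance ratio.
T_filament = T_room * (R_hot / R_cold) ** (1 / 1.2)
print(f"estimated filament temperature ~ {T_filament:.0f} K")   # ~2600 K
```

That comes out around 2600 K, a plausible figure for a 40-watt incandescent filament.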
Let’s look at the dynamic case, starting with the thermal equilibrium under the glass container alone. 35 watts are coming into the bulb from the electrical system, and 35 watts are leaving the bulb through conductive losses to the air and radiative losses to the room through the glass. Now we replace the glass with one of the metal shells. Conductive losses are not decreased (and may well be increased). But now the bulb is receiving radiant power from the metal shell, whether reflected in one case, or absorbed and re-radiated back at longer wavelengths in the other. Now the power into the bulb exceeds the power out, so the temperature starts to increase. (If you want to think in terms of net radiative exchange between the bulb and the shell, this net radiative output from the bulb decreases, and you get the same power imbalance.)
As the temperature of the bulb increases, the conductive losses to the air at the surface of the bulb increase (approximately in proportion to the temperature difference) and the radiative losses increase as well (in proportion to the difference of the 4th powers of the absolute temperatures). Eventually, these losses grow until they once again match the input power, and a new, higher-temperature thermal equilibrium is reached.
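A minimal lumped-capacitance sketch captures this feedback. Every parameter below is an assumed round number (area, emissivity, film coefficient, heat capacity, and the fraction of input power escaping directly as filament radiation through the envelope), and the surrounding shell is represented only as a fixed amount of power returned to the bulb, so the absolute temperatures are illustrative rather than predictive:

```python
sigma = 5.67e-8
P_in, T_amb = 35.0, 296.0          # electrical input (W), room temperature (K)
A, eps, h, C = 0.008, 0.90, 12.0, 20.0   # assumed area, emissivity, film coeff., heat capacity
f_direct = 0.70                    # assumed fraction of P_in escaping directly as
                                   # filament radiation through the envelope

def equilibrate(back_W=0.0, dt=0.5, steps=40000):
    """March the lumped envelope model to steady state. back_W is extra power
    returned to the bulb by a surrounding shell (reflected or re-emitted)."""
    T = T_amb
    for _ in range(steps):
        loss = h * A * (T - T_amb) + eps * sigma * A * (T**4 - T_amb**4)
        T += dt * ((1 - f_direct) * P_in + back_W - loss) / C
    return T - 273.15

print(f"no shell:            ~{equilibrate():.0f} C")
print(f"shell returning 5 W: ~{equilibrate(back_W=5.0):.0f} C")
```

The point is the behavior, not the exact numbers: add returned power, and the temperature ramps up until the growing losses absorb it, settling at a new, higher equilibrium rather than running away.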
I originally did these tests employing a cylindrical glass container 150mm in diameter and 150mm high with and without foil shells, and got comparable results. In the second round shown here, I changed to a cubic container, so I could also create a black-plate shell of the same shape.
It is certainly possible that improvements to these experiments could shift the results by 1 or 2C, but I don’t see any way that they could wipe out the gross effect of the warming from the “back radiation”, which is several tens of degrees C.
All of these results are completely in line with the principles taught in undergraduate engineering thermodynamics and heat transfer courses. The idea that you could inhibit net thermal losses from an object with an internal power source, whether by conductive, convective, or radiative means, without increasing the temperature of that object, would be considered ludicrous in any of these courses. As the engineers and physicists in my group came by the lab bench to see what I was up to, not a single one thought for a moment that this back radiation would not increase the temperature of the bulb.
Generations of engineers have been taught these principles of thermal analysis, and have gone on to design crucial devices and infrastructure using them. If you think all of this is fundamentally wrong, you should not be spending your time arguing on blogs; you should be out doing whatever it takes to shut down all of the erroneously designed, and therefore dangerous, industrial systems that use high temperatures.
Conclusions
This experiment permitted the examination of various radiative transfer setups while controlling for conductive/convective losses from the bulb. While conductive/convective losses were not eliminated, they were at least as great, and probably greater, in the cases where a metal shell replaced the glass shell over the bulb.
Yet the bulb surface temperature was significantly higher with each of the metal shells than with the glass shell. The only explanation can therefore be the radiative transfer from the shells back to the bulb. In every case, the shells were significantly cooler than the bulb throughout the entire experiment, in both the transient and equilibrium conditions.
We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for. This is true both for reflected radiation of the same wavelengths the warmer body emitted, and for absorbed and re-radiated emissions of longer wavelengths. The temperature effects are so large that they cannot be explained by minor setup effects.
Electrical measurements were made to confirm that there was not increased electrical power into the bulb when it was at higher temperatures. In fact, the electrical power input was slightly reduced at higher temperatures.
This experiment is therefore compatible with the standard radiative physics paradigm that warmer and cooler bodies can exchange radiative power (but the warmer body will always transfer more power to the cooler body). It is not compatible with the idea that cooler bodies cannot transfer any power by radiative means to warmer bodies and cause an increase in temperature of the warmer body.
=====================================
UPDATE: The Principia/Slayers group has posted a hilarious rebuttal here:
http://principia-scientific.org/supportnews/latest-news/210-why-did-anthony-watts-pull-a-bait-and-switch.html
Per my suggestion, they have also enabled comments. You can go discuss it all there. – Anthony
You really need to go look into what makes stars do what they do, and then consider that each star is in fact in the process of irradiating the other star. I can’t imagine what you could possibly mean by “losing its own heat”, because both stars are constantly in the process of losing their heat. They can’t NOT lose their own heat.
I don’t know that statistics on close binaries have been kept, but I would be shocked if they didn’t have shorter lifetimes than single stars of the same type, because they are in fact heating each other.
Look, only a hypothetical perfect reflector can fail to gain energy when exposed to a source of radiation. Are you saying one or the other of those stars is a perfect reflector?
Carrick says:
May 28, 2013 at 10:31 am
——–
If you think about it, what’s likely to happen if the filament temperature rises and current drops is the temperature of the filament will self-regulate within bounds. The power will drop slightly, which is probably the only thing that keeps the filament from overheating.
Thermodynamics requires that a direct transfer of heat by conduction go from the higher temperature to the lower.
But in the case of two bodies, if both are radiating, of course they are going to transfer energy. Imagine two bodies in space, separated by one foot that are radiating infrared. If you have Body A putting out 1kW radiatively and Body B putting out 500W radiatively, Body B is going to transfer X amount of joules to Body A, which would raise the temperature. (I could calculate how much, but it’s been years since I took Thermo…) Body A is also going to transfer Y amount of joules to Body B.
Your conclusion is fundamentally flawed. The statement: “We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for.” is false. What you have is solid experimental evidence that reducing the amount of energy (heat) leaving a system while maintaining the amount of energy (heat) entering the system will result in a temperature rise in the system. The temperature rise is due to the increasing amount of heat in the system and this heat is coming from the electric current being supplied to the bulb, not from the back radiation.
1) temperature rises
2) resistance increases
3) P=V^2/R decreases
4) temperature rate of increase slows
5) Goto 1)
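A toy numeric version of that loop (with assumed round numbers for the filament, not values taken from this experiment) settles to a steady state rather than running away:

```python
# Illustrative round numbers only, not measurements from the experiment above.
sigma, V = 5.67e-8, 120.0
R_cold, T0 = 30.0, 293.0           # cold resistance (ohm) at room temperature (K)
A_fil, C_fil = 2.0e-5, 0.01        # assumed filament area (m^2) and heat capacity (J/K)

T, dt = T0, 1e-4
for _ in range(200000):
    R = R_cold * (T / T0) ** 1.2           # 2) resistance increases with temperature
    P = V**2 / R                           # 3) P = V^2/R decreases as R rises
    T += dt * (P - sigma * A_fil * T**4) / C_fil   # 1)/4) temperature settles
print(f"steady state: T ~ {T:.0f} K, R ~ {R:.0f} ohm, P ~ {P:.0f} W")
```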
Seeing that the position of the PSI is that the proximity of a cooler object cannot cause a hotter one to get even hotter (something I had not realized before, due to its absurdity), I wonder why they have restricted it to thermal radiation. Surely, to be consistent, they must also apply this principle to conduction and convection, since these are all methods of thermal transfer. That being the case, how does invoking convective restriction as the explanation of the present experimental result help their case? Once we accept this logical extension of the PSI theory, most reasonable people should see the flaw, since we are all very familiar with the benefits of blankets on a cold winter's night. But since thermal radiation is a more abstract concept, there may be a tendency either to ignore it or to descend into magical thinking aided by an endless stream of non sequiturs. There may also be an equating of conservation of energy with conservation of temperature; the latter does not hold. Temperature is a statistical parameter (by one definition).
So, your theory is that this heat coming from the electric current being supplied to the bulb knows when there’s a reflector in place?
If not, what?
Author (Curt), needed now is a list of test equipment used (manufacturer and model number). Some of us would like to look up the specs on such items as the clip-on ammeter, and the method by which the AC voltage is calculated internally by the DVM (e.g. an ‘average’ value corrected to read RMS, or a true-RMS reading/calculation technique). Thank you for this in advance, and also for your work on this subject here today.
I assume the assembled multitude here is also cognizant of the dynamic change in filament R (resistance) over a complete cycle of applied AC, where the peak voltage is on the order of 1.414 times the nominal 120 V RMS value, resulting in a figure around 170 volts. This results in a peak current nearly coincident with the peak of the applied AC voltage, falling off thereafter in a nonlinear manner (no longer observing the simple E = I×R relationship, but one with time as a parameter).
This nonlinear response is MOST noticeable if one ‘powers’ a bulb during only one half (1/2) of the applied AC mains sinusoid (using a simple series diode); one can see a really lop-sided current ‘draw’ on a scope trace set to monitor the *dynamic* current drawn by the bulb.
THAT is why I mentioned the Kill-A-Watt appliance measuring ‘instrument’ which calculates RMS power (Root of the Mean of the sum of Squares of a time series of measured values) consumed, as opposed to making use of ‘apparent power’ (simple I times E) calculation usually using the simple ‘peak’ reading obtained from a digital Volt or current meter (DMM or DVM).
(Better yet, utilize nowadays an NI DAQ (analog data acquisition) card and digitize the I and V values out to 12 or 14 bits over a complete waveform (at, say, 500 us intervals) and analyze the ‘slope change’ of the I draw and calculate the temp change in the tungsten filament that way.)
Experiments in this area were undertaken in the early eighties looking at different means to extend incandescent bulb life. A Tektronix 564B Mod121N O-scope was used in conjunction with current ‘shunts’ to facilitate observations of bulb current-draw behavior over time under different applied AC waveforms.
I mean, the filament is not producing any more heat when the cover is on than when the cover is off.
Sincerely baffled, here.
TO Michael Tremblay
“What you have is solid experimental evidence that reducing the amount of energy (heat) leaving a system while maintaining the amount of energy (heat) entering the system will result in a temperature rise in the system.”
OMG you finally understood how it works! The sun is providing a (so to say) constant energy input, the GHGs reduce the amount of energy being released to space… And voilà!!!
Here’s another conceptually simple experiment that slays the slayers. Take 2 identical steel spheres, put two diametrically opposite embedded thermometers on each of them. Heat one of the spheres to 2,000 deg C and leave the other at room temperature. Then hang each of them closely together, but not touching in a vacuum, with one thermometer on each of them as close as possible to the nearest thermometer on the other, and the other 2 thermometers as far apart as possible.
Then measure how the temperature changes at each of the 4 thermometers as the spheres are allowed to approach thermal equilibrium. For the first part of the experiment, the thermometer on the hot sphere closest to the cool sphere will cool more slowly than the thermometer on the hot sphere furthest from the cold sphere. The thermometer on the cold sphere closest to the warm sphere will warm more quickly than the thermometer on the cold sphere furthest from the hot sphere. As time passes there will come a point where each of the thermometers on the cold sphere will start to cool after warming. Eventually both spheres will be in thermal equilibrium with each other and all thermometers will read the same.
The reason that the closest thermometer on the hot sphere cools more slowly is because some back radiation comes from the warming cold sphere. This slays the slayers.
There is no convection in the experiment, there is no conduction between spheres just within each sphere. There is no ongoing energy supply so the simplest form of the laws of radiative thermodynamics will apply. The experiment is symmetric. The Boltzmann profile of each sphere is identical.
If someone were to posit that conduction in the hot sphere keeps both of its thermometers cooling at the same rate, then since the 2 thermometers would be at the same temperature there would be no net conduction between them; but extra photons are arriving on the side closest to the cooler sphere, so it must cool more slowly.
Problem being, this varies dynamically even over the applied AC sinusoid; above is a post where I propose to measure the bulb filament’s characteristics over an applied cycle in 500 us steps (using DAQ, or data acquisition, cards); the instantaneously measured V/I performance of the filament could then be used to calculate its instantaneous R, and infer the change in temperature.
In lieu of such difficult measurement setup and/or apparatus, simply power the bulb from a DC power supply and do away with the dynamic thermal performance of the tungsten filament over the course of an applied AC sinewave … the values then read by simple instrumentation (simple DC volt and amp meters) will need no correction as the filament measurement conditions will be ‘static’ as opposed to ‘dynamic’ at a 60 Hz rate.
Anton Eagle says:
May 28, 2013 at 9:29 am
If you’re going to get the science wrong… then at least be internally consistent.
Anton and Gentlemen,
Our writer says “in every case” and states the current to three decimal places. Clearly he is telling us the filament temperature does not change. The bulb temperature is affected by a complex flux, of course it will change. Those who state that the bulb temperature is a better analog to the Earth’s surface have not even entered the debate. The only question of Physics here is, can a heated filament increase its own temperature by being exposed to its reflected IR? If it could we would certainly be able to lower our electric bills. Just think, an Einsteinian thought experiment: Say the filament temperature DID go up. Would it not consequently, in the most Stefan-Boltzmann’s Law way, radiate more heat and light, which would be reflected more strongly, increasing its temperature still more in a runaway cycle? Will the Perpetual Motion patent holder please tell us how he did it?
Photons of course have a frequency, and a “wavelength,” please do not forget this dear readers. Otherwise CO2 could absorb all photons, instead of just those at the 15-micron and its other resonant wavelengths.
Somebody certainly did get the science wrong…
So, you are comparing a closed system to an open system and making conclusions about proof.
Interesting that everything was tested except the one item you are trying to prove has this certain quality, a greenhouse gas.
Let’s see if we get this correct… Glass retains the heat and the temperature goes up; aluminum foil reflects the exact same energy directly back as well as retaining the heat, and the temperature goes up faster; putting glass between the bulb and the reflective foil causes some of that energy to be lost as it passes through the absorbent glass twice.
This is science? It seems more like a vengeance attack against your enemy that is more or less making you look more the fool than your target.
Michael Tremblay:
It helps if you start by using the correct language to describe the problem:
Technically the electric current is providing a fixed rate of power that is providing a (nearly) fixed rate of thermal energy per unit time. For the temperature to increase, the net amount of thermal energy stored in the system would have to increase. “Heat” is a thermodynamic term relating to a process variable, namely it refers to the amount of heat energy exchanged between two bodies during one cycle of a thermodynamic process.
The fact that the temperature of the system increases when you manipulate the system by e.g. putting a reflective aluminum shroud around it tells you that something has changed to allow more thermal energy to be stored in an equilibrium state. That is something is impeding the outwards flow of thermal energy per unit time from the heat energy source.
As we see from the measurements, the amount of thermal energy stored actually increased while the amount of available electrical power decreased (slightly). The explanation cannot be simply that there is a power source internal to the system, because that fails to explain why the heat energy is being affected by the different experimental configurations.
It must be that there is a physical mechanism associated with the aluminum foil that is causing the temperature to increase more than with e.g. the black-anodized metal container.
Since this science has already been well tested experimentally, the most plausible explanation is that radiative physics is involved.
The only thing missing from this experiment is the control needed to actually calculate the amount of warming observed under the different experimental conditions. A quantitative comparison to a physics-based model would make this publishable in e.g. the American Journal of Physics, since they welcome physics demonstrations that are suitable for classroom environments.
Roy Spencer said: “Michael Moon, you make the same fundamental mistake the Slayers do…equating energy input to temperature. Temperature is NOT determined by the rate of energy input alone.”
I was going to point this out in the last thread but didn’t. It appears these guys do not know the difference between “heat” and “temperature”. They are absolutely NOT the same thing…
Michael Tremblay says:
May 28, 2013 at 10:44 am
“Your conclusion is fundamentally flawed. The statement: “We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for.” is false. What you have is solid experimental evidence that reducing the amount of energy (heat) leaving a system while maintaining the amount of energy (heat) entering the system will result in a temperature rise in the system.”
How can the amount of energy leaving the system be less than the amount entering, if the system is at equilibrium?
The temperature of the bulb increases until equilibrium is reached, at which point energy leaving the system (at the surface of the foil box) equals energy entering via the filament, which is assumed to have remained constant.
In fact, the energy leaving the system never changed, and neither did the energy entering. What changed was the size of the system, moving from the filament and bulb, to filament, bulb and foil box.
Slartibartfast says:
May 28, 2013 at 10:22 am
Heh, your comment made me wonder whether Slayers eat microwave dinners, or whether their food is too exclusive for photons at the wavelength of water.
Another thought experiment. Two stars, 4 light-years apart. Measure the radiated energy of each, and sum them together. Now move them closer together by half the distance. The energy they impinge on each other increases by 4 (inverse square law). Does the sum of the radiated energies for the pair *as a system* increase? What about 1 light-year? Half a light year?
astonerii:
Heat energy is being lost in all cases, so it’s always an open system.
Might consider redoing this using a clear bulb. That would eliminate the effects of the interior coating of the bulb. Otherwise, very interesting indeed.
Vince:
I think you mean in equilibrium, right? Clearly when the interior is heating up the amount of heat energy per unit time leaving would be less than at equilibrium.
This whole filament/bulb argument is misdirection. The bulb is a grey body heated by a source; whether that source is the filament or the sun makes no difference. Bottom line is that the bulb temperature increases in the presence of a colder outer layer, thus the colder outer layer warmed (or slowed the cooling of if you prefer) the inner layer. Sky-Dragon-Slayer premise busted!
Michael Moon:
Clearly you don’t have any physics training.
Once again you are confusing electrical power (which actually isn’t a constant, it decreases slightly as the filament heats up) with temperature, which is a measure of thermal energy.
“We therefore have solid experimental evidence that radiation from a cooler object (the shell) can increase the temperature of a warmer object (the bulb) with other possible effects well controlled for.”
“Radiation from a cooler object”? That statement ought to dismiss anything else written in the essay. A cooler object doesn’t radiate energy to a warmer one.