Guest post by Alec Rawls
Just fill the ball with warm, humid indoor air. When its temperature equalizes with outdoor air 25°F cooler on your AFC Championship playing field, some of the water vapor in the ball will condense into liquid water, leaving less gas in the ball. That would solve the great mystery: how did the footballs used by the Championship-winning New England Patriots show 12.5 psi of inflation pressure in the official pre-game check but only 10.5 psi when checked at halftime?
There is also a decrease in pressure due to the cooling of the molecules that remain gaseous. Those air molecules are not zipping around as fast as they were, so they exert less outward pressure on the ball. But according to the ideal gas law, if there were no reduction in the number of gas molecules in the balls, it would have taken a large drop in temperature, about 40°F, to cause the observed drop in air pressure. So says Boston College professor Martin Schmaltz:
In order for a ball to register a 10.5 PSI in a 50 degree environment [the temperature on the field at halftime] but register a 12.5 PSI in the testing environment, the ball would have to have been inflated, stored, and/or tested in a 91 degree environment.
I verify Schmaltz’s calculations at the end of this post, and while I’m no expert in the field, I get the same answer he does.
It wouldn’t be hard to deliver balls to the pre-game pressure check with 91°F air inside: just inflate them in a 100°F sauna shortly before testing. But the Patriots are adamant that they do not know why the air pressure in their balls was low at halftime, and if they had inflated their game balls in a sauna they would certainly know it.
The Carnegie Mellon experiment
An experiment performed by a team at Carnegie Mellon provides empirical support for the Patriots’ claim to have done nothing unusual. The Carnegie experimentalists inflated a batch of footballs to 12.5 psi at a room temperature of 75°F, then let the balls equalize to a new ambient temperature of 50°F, resulting in an average pressure drop of 1.8 psi. (They also wet the leather balls to simulate the rainy conditions of the game, surmising that this might allow stretching that would reduce air pressure in the ball, but this seems likely to be a minor factor.) The Carnegie experiment is video-documented here:
So how to account for the difference between the Carnegie findings and the ideal gas law, which predicts that a much larger decrease in temperature would be needed to create the observed pressure drop? Barring experimental error, it seems that the difference would have to be explained by condensation. Gas was removed from the ball, not via an inflation needle but by conversion to liquid water. What do our blog-reading experts say? Is this the likely explanation?
The Carnegie group was not monitoring humidity (at least in the short video above), but if this is the explanation for their greater-than-ideal pressure drop then it could easily have happened to the Patriots the same way without anyone intentionally manipulating the inflation temperature or humidity. Still…
It must be common knowledge around the league that indoor inflation yields a softer game ball
The fact that the Colts’ balls did not show a similar pressure drop suggests that teams do know how to make these manipulations. Just as Patriots’ quarterback Tom Brady prefers to throw a less inflated ball, other quarterbacks are known to prefer harder footballs.
If Colts quarterback Andrew Luck prefers a harder ball, then all the Colts had to do was fill their balls pre-game with cool outdoor air. Ambient outdoor temperatures actually rose from pre-game to halftime, so the temperature effect would have made their balls firmer. Also, moisture beyond what the cooler air could hold would never have made its way into the ball in the first place, so there would not have been any pressure-reducing condensation inside the ball either.
Players and equipment managers would surely have noticed over the decades how the conditions in which balls are inflated to regulation pressures affect ball firmness on the field. The basics are hard to miss. In cold conditions, inflate outdoors to get a firm ball, indoors to get a softer ball.
The existing pressure-test regimen, intentionally or not, leaves this room for teams to manipulate ball pressure to suit their preferences. The rule just says that air cannot be put into or removed from the ball after the pre-game pressure check. It does not regulate the conditions in which the balls are inflated going into the pre-game pressure check.
If Coach Belichick had exploited this loophole to the max by inflating balls in the sauna, there would be a legitimate question whether this rule-bending constitutes cheating. There is plenty of history, both recent and ancient, to indicate that Belichick is eager to wring every advantage he can out of a loophole. Where others may see exploiting loopholes as cheating, Belichick sees it as part of the game.
By the time he is done, the NFL rule book will contain at least a few “Belichick rules,” closing the loopholes he has so nicely pointed out, most recently by confusing the Baltimore Ravens about which Patriots players were eligible to receive passes. “It’s not something that anybody has ever done before,” complained Ravens coach John Harbaugh. “I’m sure the league is going to look at it and make some adjustments.”
Belichick’s reward (besides a trip to the AFC Championship): he is now tied with Tom Landry for the most post-season coaching wins in league history, to which I say GO PATRIOTS! (That’s what you call “full disclosure.”)
But the full explanation in the present case seems to be that the Patriots filled their game balls with indoor air. If that is manipulation at all it must be utterly commonplace and well within the rules.
The biggest loser: Bill Nye, the phony-science guy
While real scientists keep acknowledging that the move from inside to outside can cause a substantial drop in football psi, Nye went on national television to proclaim that air must have been taken out of the balls with a needle. So that’s good anyway. Half the Northeast now knows that Bill Nye is an idiot.
Addendum: Gas law calculations
I was looking up how to calculate the expected pressure drop in a ball for a given temperature drop when I came across the claim from Boston College physicist Martin Schmaltz that, following the ideal gas law, temperature inside the balls would have had to be 91°F during the pre-game pressure check to account for the 2 psi drop in air pressure by halftime. In the exercise below I come up with a similar answer but I have no background in this stuff and am just following readily available information so don’t take my explication on authority (and please do note any inaccuracies in the comments).
When the number of gas molecules in a container is fixed (no gas escaping out through the bladder and no gas converting to liquid via condensation) then the ideal gas law simplifies to the general gas law, also called the combined gas law. Like the ideal law, the general law is said to be close to accurate so long as extreme pressures or temperatures are not involved. Mathematically, the general law just says that gas temperature, volume and pressure all vary in direct proportion to each other:
(P1V1)/T1 = (P2V2)/T2, where P1 is pressure at time 1, V1 is volume at time 1, and T1 is temperature at time 1.
In plain language, for the gas pressure in the Patriots’ footballs to drop by 7%, the general gas law says that the temperature of the air in the balls must drop by 7%, or the volume inside the ball must increase by 7%, or there must be some combination of percentage changes in temperature and volume that together account for the 7% (percentage changes this small combine nearly additively).
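That 7% figure refers to absolute pressure, not the gauge reading. A quick check, converting gauge psi to absolute by adding atmospheric pressure (about 14.7 psi, as the post explains further down):

```python
# Fractional pressure drop in the Patriots' balls, in absolute terms.
ATM_PSI = 14.7  # approximate atmospheric pressure, psi

p1 = 12.5 + ATM_PSI  # pre-game absolute pressure, psi
p2 = 10.5 + ATM_PSI  # halftime absolute pressure, psi

drop_pct = (p1 - p2) / p1 * 100
print(round(drop_pct, 1))  # 7.4
```

So the "7%" in the text is a slight rounding down of about 7.4%.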
The problem can be simplified further by assuming (as Professor Schmaltz does) that the volume of the space inside the football remains constant. (This won’t be fully accurate: when pressure in a ball drops, the volume inside the ball shrinks a small amount. This shrinking makes pressures in the low-pressure state higher than they would be if the ball didn’t shrink, so the constant-volume estimate of the temperature change required to account for the observed pressure drop will be a bit on the low side, unless the Carnegie experimentalists are correct and there is an offsetting increase in volume when the balls get wet.)
With fixed volume the general gas law becomes: P1/T1 = P2/T2
All of these numbers are known except for T1, the temperature of the air in the ball when it was first tested 2 hours before game-time. The known numbers just have to be converted from relative to absolute form.
First, the inflation pressure measurements are in pounds per square inch above atmospheric pressure, so to get the full pressure inside a ball it is necessary to add atmospheric pressure (about 14.7 psi) to the measured psi.
Also, the gas law is based on degrees above absolute zero, which for Fahrenheit-sized degrees are “degrees Rankine,” which are Fahrenheit + 460. Solving for T1 in degrees Rankine:
T1 = (P1 x T2)/P2 = ((12.5 + 14.7) x (50 + 460))/(10.5 + 14.7) = (27.2 x 510)/25.2 = 550.5°R = 90.5°F
Which rounds up to Professor Schmaltz’s 91°F.
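The same arithmetic as a short script, using the values quoted in the post (gauge psi plus 14.7 psi atmospheric, degrees Rankine as Fahrenheit + 460):

```python
# Solve the fixed-volume gas law P1/T1 = P2/T2 for T1, the pre-game
# air temperature inside the ball.
ATM_PSI = 14.7         # approximate atmospheric pressure, psi
RANKINE_OFFSET = 460   # Fahrenheit-to-Rankine conversion

P1 = 12.5 + ATM_PSI            # pre-game absolute pressure, psi
P2 = 10.5 + ATM_PSI            # halftime absolute pressure, psi
T2 = 50 + RANKINE_OFFSET       # halftime field temperature, degrees R

T1 = P1 * T2 / P2              # required pre-game air temperature
print(round(T1, 1))                    # 550.5 (degrees Rankine)
print(round(T1 - RANKINE_OFFSET, 1))   # 90.5 (degrees Fahrenheit)
```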
Calculations for the Carnegie-Mellon experiment
In the Carnegie-Mellon experiment the expected post-equalization ball pressure, calculated just using the general gas law (where no gas is lost to condensation), is:
P2 = (P1 x T2)/T1 = [(12.5 + 14.7) x (50 + 460)]/(75 + 460) = 25.9 psi
Subtract atmospheric pressure (14.7 psi) to get an expected pressure test reading of 11.2 psi, vs. actual experimental readings of 10.7 psi. The suggestion here is that the additional pressure drop found in the Carnegie experiment is a result of water vapor condensation.
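Running the same fixed-volume relation forward from the Carnegie starting conditions reproduces the expected reading:

```python
# Predict the post-cooling gauge reading for the Carnegie Mellon setup,
# assuming no gas is lost to condensation (general gas law, fixed volume).
ATM_PSI = 14.7         # approximate atmospheric pressure, psi

P1 = 12.5 + ATM_PSI    # initial absolute pressure, psi
T1 = 75 + 460          # inflation room temperature, degrees Rankine
T2 = 50 + 460          # outdoor temperature, degrees Rankine

P2 = P1 * T2 / T1      # expected absolute pressure after cooling
print(round(P2, 1))            # 25.9 (absolute psi)
print(round(P2 - ATM_PSI, 1))  # 11.2 (expected gauge reading)
```

The experiment’s actual 10.7 psi reading is 0.5 psi below this no-condensation prediction.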
If the Carnegie experimentalists were careful, they would have compensated for the small pressure drop that comes from applying their pressure tester, but game officials (who measured halftime pressure as 10.5 psi) might well not have taken this source of pressure loss into account. If they had, then the difference between their measured pressure drop of 2 psi and Carnegie’s measured drop of 1.8 psi might disappear.