warm room and then took them outside early enough to cool off before the game, the balls would end up about 2 psi below the designated pressure.
We can run the numbers:
P1 × V1 / T1 = P2 × V2 / T2
T is absolute temperature (°F + 460 to get °R)
If it's 75 °F inside
and 45 °F outside,
use the P1 minimum of 12.5 psi
and assume V is constant (probably not quite):
12.5 / (75 + 460) = P2 / (45 + 460)
P2 ≈ 11.8 psi
So temperature alone accounts for roughly 0.7 psi; looks like a full 2 psi drop means tampering.
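The arithmetic above is easy to sanity-check in a few lines of Python. This sketch uses gauge pressure for P1, exactly as in the post (a later reply in the thread points out that P should really be absolute too):

```python
def cooled_pressure(p1, t1_f, t2_f):
    """Pressure after cooling at constant volume, via the ideal gas law.

    Temperatures are converted from Fahrenheit to absolute (Rankine)
    by adding 460. P1 is taken as-is, per the post above.
    """
    return p1 * (t2_f + 460) / (t1_f + 460)

p2 = cooled_pressure(12.5, 75, 45)
print(round(p2, 1))  # 11.8
```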
Yes, but realize that your calcs are using the “Ideal Gas” law...and as we know, no gas is “ideal” [especially when released under the bed-covers, according to my wife!].
Using several differential equations, we might be able to verify a change closer to... say... 1 psi (12.5 vs. 11.5)?
Sorry, just arguing from a Patriots fan perspective.
:-)
Allowing for the detail that the balls would be somewhat warmer than 75 °F when they were pumped up, since a hand pump warms the air as it compresses it, I come out in the vicinity of 2 psi. Of course, it also depends on the indoor and outdoor temperatures, air lost as the needle is removed, and variations among pressure gauges.
I made the same mistake on another thread.
Need to use absolute pressures, not gauge pressures.
I also looked at the assumption of constant V. If the volume of the football changes by 5% or less, then cooling accounts for all the loss. If the volume changes by 10%, then the cooler weather doesn’t account for much of the deflation at all.
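Redoing the calculation with absolute pressures (adding roughly 14.7 psi of atmosphere before applying the gas law, then subtracting it back out) is a small change to sketch. The `v_ratio` parameter here is my own addition for playing with the constant-volume assumption; the exact volume sensitivity will depend on the temperatures and atmospheric pressure you assume:

```python
ATM = 14.7  # psi, approximate sea-level atmospheric pressure

def cooled_gauge_pressure(p1_gauge, t1_f, t2_f, v_ratio=1.0):
    """Gauge pressure after cooling, done correctly with absolute
    pressures and absolute (Rankine) temperatures.

    v_ratio = V2 / V1 lets you relax the constant-volume assumption
    (v_ratio > 1 means the ball's volume grew).
    """
    p1_abs = p1_gauge + ATM
    p2_abs = p1_abs * (t2_f + 460) / ((t1_f + 460) * v_ratio)
    return p2_abs - ATM

# Constant volume: ~10.97 psi gauge, i.e. a drop of about 1.5 psi,
# noticeably less than the naive gauge-pressure answer of 11.8.
print(round(cooled_gauge_pressure(12.5, 75, 45), 2))

# Even a small volume increase makes the temperature-only drop bigger:
print(round(cooled_gauge_pressure(12.5, 75, 45, v_ratio=1.02), 2))
```

Run both ways, the difference between the gauge-pressure shortcut and the absolute-pressure version is about 0.8 psi, which is why getting this detail right matters to the tampering question.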