The following discussion from American Thinker comments may help explain the article:
It is a fundamental scientific principle — one of many violated flagrantly by modern scientists — that you SHOULD NOT consider the final calculation to more “significant digits” than the original data.
So let’s say you are simply calculating an average measurement. For the high school junior varsity football team you have weights on file of:
182.456 pounds, 179 pounds, 165.44 pounds, 189.4243543 pounds, 185 pounds, 187.34 pounds, 193.324234 pounds, 190.43 pounds, 188.45 pounds, 169.88 pounds, and 174.54 pounds
The only scientifically correct average is 182 pounds.
The only scientifically correct calculation is: (182 + 179 + 165 + 189 + 185 + 187 + 193 + 190 + 188 + 170 + 175) / 11 ≈ 182 pounds
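A quick sketch of the commenter's method in Python (round each weight to the precision of the least precise measurement, whole pounds, then average):

```python
# Weights from the example above; several are recorded only to the
# nearest pound, so every value is first rounded to whole pounds.
weights = [182.456, 179, 165.44, 189.4243543, 185, 187.34,
           193.324234, 190.43, 188.45, 169.88, 174.54]

rounded = [round(w) for w in weights]
avg = sum(rounded) / len(rounded)
print(round(avg))  # 182
```

Note that averaging the full-precision values gives about 182.30, which also rounds to 182 pounds, so in this case both approaches land on the same reportable figure.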
The greater accuracy of some of the measurements must be ignored.
The presence of less accurate measurements in the calculations renders the greater accuracy of some measurements completely meaningless and deceptive.
If you report the average to the greater level of accuracy, it will be a false and misleading result.
That in a nutshell is Anthropogenic [Mann-made] Global Warming: GUESSES from observing bacteria content in fossils, from tree rings, and from gas samples trapped in ancient ice are being combined with modern thermometer readings.
I think I know where you are going with that but I am not sure. Could you explain it to me, please?
A corollary: If something weighs 182.00 pounds, don’t forget to write down the 2 decimal places - they matter (despite what Microsoft Excel thinks).
Depends entirely on the accuracy of the scale’s reporting device. If it’s 182 pounds and you are using a load cell that has a processor that gives you 10 digits of precision, then it’s 182.0000000000 pounds ~ so were you rounding off the readout from a digital scale or what?