This article says that NASA uses 15 digits after the decimal point, which I’m counting as 16 significant digits in total, since that’s how we count digits in scientific notation. If you round pi to 3, that’s one significant digit, and if you only keep the order of magnitude (calling it roughly 1), that’s zero digits.
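To make that counting concrete, here’s a tiny sketch (my own, nothing official — round_sig is just a throwaway helper I made up) that rounds pi to a given number of significant digits:

    import math

    def round_sig(x, sig):
        """Round x to `sig` significant digits (sig >= 1, x != 0)."""
        return round(x, sig - 1 - math.floor(math.log10(abs(x))))

    print(f"{math.pi:.15f}")       # 3.141592653589793 -> 15 decimals, 16 significant digits
    print(round_sig(math.pi, 3))   # 3.14 -> three significant digits
    print(round_sig(math.pi, 1))   # 3.0  -> one significant digit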
I know that 22/7 is an extremely good approximation for pi: it takes only 3 digits to write, yet it’s accurate to about 3 and a half digits. Another good one is √10, which is accurate to a little over 2 digits.
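Quick sanity check of those accuracy claims, counting digits as -log10 of the relative error (just my own back-of-the-envelope measure):

    import math

    # Digits of accuracy ~ -log10(relative error)
    for name, approx in [("22/7", 22 / 7), ("sqrt(10)", math.sqrt(10))]:
        rel_err = abs(approx - math.pi) / math.pi
        print(f"{name}: {approx:.7f}, good to about {-math.log10(rel_err):.1f} digits")

    # prints:
    # 22/7: 3.1428571, good to about 3.4 digits
    # sqrt(10): 3.1622777, good to about 2.2 digits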
I’ve heard that ‘field engineers’ used to use these approximations to save time when doing math by hand. But what field, exactly? Can anyone give examples of fields that use fewer than 16 digits? In the spirit of something like xkcd: Purity, could you rank different sciences by how many digits of pi they require?
A 64-bit IEEE float has 53 significant bits (the “mantissa” or “significand”), and log10(2^53) is 15.9546.
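If anyone wants to check that number, here’s a one-liner-ish version (this assumes your Python float is a standard IEEE 754 binary64, which it is on basically every platform):

    import math
    import sys

    print(sys.float_info.mant_dig)   # 53 significant bits in a binary64 float
    print(53 * math.log10(2))        # ~15.95 equivalent decimal digits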
Isn’t it just 15 significant figures then?
I would round up to 16.
Yeah I wasn’t sure if it would be correct to throw out the exponent entirely or if it might end up contributing some amount to the final accuracy of the number. I hadn’t spent a lot of time thinking about the problem.
Yeah, the exponent just lets you represent lots of magnitudes, but it doesn’t contribute to the accuracy, because the value is basically 1.xyz × 2^exponent. So the xyz significand is the only part that counts for significant digits. Although I guess in some sense you are partially right: because the exponent exists, the leading bit is assumed to always be 1 (otherwise you would just adjust the exponent until it is), so only 52 bits actually have to be stored.
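If it helps, here’s a little sketch (my own, using Python’s struct module) that pulls a double apart into those stored fields, so you can see the 11-bit exponent, the 52 stored fraction bits, and the implicit leading 1 being added back:

    import math
    import struct

    # View the 64 bits of a double as an integer (big-endian byte order).
    bits = int.from_bytes(struct.pack(">d", math.pi), "big")

    sign     = bits >> 63                 # 1 sign bit
    exponent = (bits >> 52) & 0x7FF       # 11-bit biased exponent
    fraction = bits & ((1 << 52) - 1)     # 52 stored fraction bits

    # Reconstruct the value: (-1)^sign * 1.fraction * 2^(exponent - 1023),
    # adding back the implicit leading 1 that is never stored.
    significand = 1 + fraction / 2 ** 52
    value = (-1) ** sign * significand * 2 ** (exponent - 1023)

    print(value == math.pi)               # True
    print(math.pi.hex())                  # 0x1.921fb54442d18p+1  (note the leading "1.")

So for normal (non-denormal) values you get 52 stored bits plus the free leading 1, which is where the 53 comes from.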