Precision[x] gives the effective number of digits of precision in the number x.
Precision[x] gives a measure of the relative uncertainty in the value of x.
With absolute uncertainty dx, Precision[x] is -Log[10, dx/Abs[x]].
For exact numbers such as integers, Precision[x] is Infinity.
Precision[x] does not normally yield an integer result.
For machine-precision numbers Precision[x] yields MachinePrecision.
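The three cases above can be illustrated with a few evaluations (outputs shown in comments, following the rules stated here):

```wolfram
Precision[2]          (* exact integer: Infinity *)
Precision[1.5]        (* machine number: MachinePrecision *)
Precision[N[Pi, 30]]  (* 30-digit arbitrary-precision number: 30. *)
```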
Numbers entered in the form digits`p are taken to have precision p.
Numbers such as 0``a whose overall scale cannot be determined are treated as having zero precision.
Numbers with zero precision are output in StandardForm as 0.×10^-a, where a is their accuracy.
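A sketch of the digits`p and 0``a input forms (outputs shown in comments):

```wolfram
Precision[3.14`10]  (* precision specified on input: 10. *)
Precision[0``5]     (* scale cannot be determined: 0. *)
Accuracy[0``5]      (* the accuracy is specified: 5. *)
```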
If x is not a number, Precision[x] gives the minimum value of Precision for all the numbers that appear in x.
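For example, applied to a list containing numbers of different precisions, Precision returns the minimum:

```wolfram
Precision[{N[Pi, 10], N[E, 25]}]  (* minimum over all numbers present: 10. *)
```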
See Section 3.1.4.
See also: Accuracy, N, Chop, SetPrecision, MachineNumberQ.
New in Version 1; modified in 5.0.