Accuracy[x] gives the effective number of digits to the right of the decimal point in the number x.
Accuracy[x] gives a measure of the absolute uncertainty in the value of x.
With uncertainty dx, Accuracy[x] is -Log[10, dx].
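For example, SetAccuracy can be used to construct a number with a stated uncertainty of 10^-20; the output shown is indicative and its displayed form may vary slightly between versions:

    In[1]:= Accuracy[SetAccuracy[1.5, 20]]
    Out[1]= 20.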
For exact numbers such as integers, Accuracy[x] is Infinity.
Accuracy[x] does not normally yield an integer result, and need not be positive.
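For example (outputs indicative), an exact integer has infinite accuracy, while a number entered with the precision mark ` has accuracy equal to its precision minus Log[10, Abs[x]], which is typically not an integer:

    In[1]:= Accuracy[3]
    Out[1]= Infinity

    In[2]:= Accuracy[1.23`5]
    Out[2]= 4.91009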
For machine-precision numbers, Accuracy[x] gives the same as $MachinePrecision - Log[10, Abs[x]].
Accuracy[0.] is -Log[10, $MinMachineNumber].
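For example, assuming IEEE double-precision arithmetic, where $MachinePrecision is about 15.9546 and $MinMachineNumber is about 2.22507*10^-308 (outputs indicative):

    In[1]:= Accuracy[100.]
    Out[1]= 13.9546

    In[2]:= Accuracy[0.]
    Out[2]= 307.653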
Numbers entered in the form digits``a are taken to have accuracy a.
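For example, a number entered with the accuracy mark `` reports exactly the stated accuracy (output indicative):

    In[1]:= Accuracy[1.234``6]
    Out[1]= 6.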
If x is not a number, Accuracy[x] gives the minimum value of Accuracy for all the numbers that appear in x.
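For example, in a list containing numbers of accuracy 5 and 10, the accuracy of the whole expression is the smaller of the two (output indicative):

    In[1]:= Accuracy[{1.0``5, 2.0``10}]
    Out[1]= 5.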
See Section 3.1.4.
See also: Precision, N, Chop, SetAccuracy.
New in Version 1; modified in 5.0.