Accuracy[x] gives the effective number of digits to the right of the decimal point in the number x.
If x is not a number, Accuracy[x] gives the minimum value of Accuracy for all the numbers that appear in x.
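A minimal illustration (the ``n input syntax marks a number with accuracy n; the exact form of the output may differ slightly between versions):

In[1]:= Accuracy[123.456``8]
Out[1]= 8.

In[2]:= Accuracy[{1.5``4, x + 2.5``10}]
Out[2]= 4.

In the second example, x is a symbol rather than a number, so Accuracy returns the minimum accuracy of the numbers that appear in the list.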
Accuracy gives Infinity when applied to exact numbers, such as integers.
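Exact quantities such as integers and rationals carry no uncertainty, so their accuracy is infinite:

In[3]:= Accuracy[7]
Out[3]= Infinity

In[4]:= Accuracy[1/3]
Out[4]= Infinity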
Accuracy assumes a precision of $MachinePrecision when applied to machine-precision numbers.
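For a machine-precision number, the accuracy follows from $MachinePrecision (about 15.95 decimal digits on typical IEEE double hardware); the value shown here is representative and depends on version and platform:

In[5]:= Accuracy[1.5]
Out[5]= 15.7785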
Accuracy can yield a negative result when the uncertainty in a number extends to the left of the decimal point, as for a large number known to only a few significant digits.
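For example, a 20-digit magnitude known to only 5 significant digits has no correct digits to the right of the decimal point (a sketch; the exact output form may vary by version):

In[6]:= Accuracy[N[10^20, 5]]
Out[6]= -15.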
See The Mathematica Book: Section 3.1.4.
See also: Precision, N, Chop, SetAccuracy.