Accuracy[x] gives the effective number of digits to the right of the decimal point in the number x.
- Accuracy[x] gives a measure of the absolute uncertainty in the value of x.
- With uncertainty dx, Accuracy[x] is -Log[10,dx].
- For exact numbers such as integers, Accuracy[x] is Infinity.
- Accuracy[x] does not normally yield an integer result, and need not be positive.
- For any approximate number x, Accuracy[x] is equal to Precision[x]-RealExponent[x].
- For machine-precision numbers whose magnitude is at least $MinMachineNumber, Accuracy[x] is given by $MachinePrecision-Log[10,Abs[x]].
- Accuracy[0.] equals Accuracy[$MinMachineNumber], as does the accuracy of any machine number smaller in magnitude than $MinMachineNumber.
- Numbers entered in the form digits``a are taken to have accuracy a.
- If x is not a number, Accuracy[x] gives the minimum value of Accuracy for all the numbers that appear in x.
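The relations above can be checked directly. A minimal sketch in Wolfram Language (the particular numbers and precision marks are illustrative assumptions):

```wolfram
x = 12.345`6;  (* a number entered with 6 digits of precision *)

Accuracy[x]                                    (* about 6 - Log[10, 12.345] *)
Accuracy[x] == Precision[x] - RealExponent[x]  (* True, per the relation above *)

Accuracy[7]        (* Infinity: exact integers have infinite accuracy *)
Accuracy[1.23``4]  (* a number entered as digits``a has accuracy near 4 *)
```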
Examples
Accuracy is the effective number of digits known to the right of the decimal point:
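For instance (a sketch; the specific number and precision mark are illustrative):

```wolfram
Accuracy[12.3456`10]
(* roughly 10 - Log[10, 12.3456]: close to 9 digits are known
   to the right of the decimal point *)
```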
Specify accuracy as the goal for N:
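A sketch using the two-argument form N[expr, {precision, accuracy}], with Infinity for the precision so that only the accuracy goal constrains the result:

```wolfram
N[E, {Infinity, 30}]  (* ask for 30 digits to the right of the decimal point *)
Accuracy[%]           (* at least 30 *)
```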
Properties & Relations
No machine number has a higher accuracy than $MinMachineNumber:
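A sketch of this bound:

```wolfram
Accuracy[$MinMachineNumber]                  (* the maximum machine-number accuracy *)
Accuracy[1.] < Accuracy[$MinMachineNumber]   (* True *)
Accuracy[0.] == Accuracy[$MinMachineNumber]  (* True *)
```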
For machine numbers, accuracy generally increases with decreasing magnitude, with a maximum at $MinMachineNumber:
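A sketch of this trend, sampling machine numbers of decreasing magnitude (the sample points are illustrative):

```wolfram
(* accuracy of 10^0, 10^-100, 10^-200, 10^-300, all machine numbers *)
Table[Accuracy[10.^-k], {k, 0, 300, 100}]
(* values increase with k: roughly $MachinePrecision + k *)
```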