ComputeUncertainty

ComputeUncertainty is an option for ClassifierMeasurements, LearnedDistribution and other functions that specifies whether numeric results should be returned together with their uncertainty.

Details

  • Uncertainties are given using Around.
  • The generated uncertainty interval typically corresponds to one standard deviation.

Examples

Basic Examples  (2)

Create and test a classifier using ClassifierMeasurements:

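The original input cells did not survive extraction. A minimal sketch of this step, using a hypothetical toy dataset (the data and test split are illustrative, not from the original page):

```wolfram
(* Hypothetical toy data: numeric features mapped to class labels *)
trainingData = {1.2 -> "A", 1.4 -> "A", 1.1 -> "A", 3.1 -> "B", 3.4 -> "B", 3.0 -> "B"};
testData = {1.3 -> "A", 3.2 -> "B", 1.5 -> "A", 3.3 -> "B"};

(* Train a classifier, then measure it on the held-out test set *)
c = Classify[trainingData];
cm = ClassifierMeasurements[c, testData]
```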

Measure the accuracy along with its uncertainty:

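Assuming a ClassifierMeasurements object cm as above, the call plausibly has the following shape; with ComputeUncertainty set, the measurement is returned as an Around object:

```wolfram
(* Accuracy with its one-standard-deviation uncertainty *)
cm["Accuracy", ComputeUncertainty -> True]
(* result of the form Around[accuracy, uncertainty] *)
```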

Measure the F1 scores along with their uncertainties:

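A sketch of the per-class measurement, again assuming the cm object from the first example:

```wolfram
(* F1 score for each class, each value wrapped in Around *)
cm["F1Score", ComputeUncertainty -> True]
(* result of the form <|"A" -> Around[...], "B" -> Around[...]|> *)
```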

Train a "Multinormal" distribution on a nominal dataset:

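The training cell is missing; a plausible sketch using LearnDistribution with a hypothetical nominal dataset:

```wolfram
(* Hypothetical nominal data; "Multinormal" requires preprocessing of nominal values *)
ld = LearnDistribution[{"A", "A", "B", "A", "B", "B", "A"}, Method -> "Multinormal"]
```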

Because of the necessary preprocessing, the PDF computation is not exact:

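Assuming the learned distribution ld from above, evaluating the PDF twice illustrates the inexactness; the two estimates can differ slightly:

```wolfram
PDF[ld, "A"]
PDF[ld, "A"]  (* a second evaluation can return a slightly different value *)
```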

Use ComputeUncertainty to obtain the uncertainty on the result:

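With the same ld object, the call presumably looks like this; the estimate is returned as an Around object:

```wolfram
PDF[ld, "A", ComputeUncertainty -> True]
(* result of the form Around[pdfValue, uncertainty] *)
```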

Increase MaxIterations to improve the estimation precision:

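A sketch of the final step, assuming the ld object from above; the MaxIterations value shown is illustrative:

```wolfram
(* More iterations give a smaller uncertainty on the estimate *)
PDF[ld, "A", ComputeUncertainty -> True, MaxIterations -> 10000]
```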
Introduced in 2019 (12.0)