"GaussianMixture" (Machine Learning Method)

Details & Suboptions

  • "GaussianMixture" models the probability density of a numeric space using a mixture of multivariate normal distribution.
  • Each Gaussian component is defined by its mean vector and covariance matrix, as in the "Multinormal" method.
  • The following options can be given:
  • "CovarianceType"Automatictype of constraint on the covariance matrices
    "ComponentsNumber"Automaticnumber of Gaussians
    MaxIterations100maximum number of expectation-maximization iterations
  • Possible settings for "CovarianceType" include:
  • "Diagonal"only diagonal elements are learned (the others are set to 0)
    "Full"all elements are learned
    "FullShared"each Gaussian shares the same full covariance
    "Spherical"only diagonal elements are learned and are set to be equal
  • Except when "CovarianceType""FullShared", the covariance matrices can differ from each other.
  • Information[LearnedDistribution[…],"MethodOption"] can be used to extract the values of options chosen by the automation system.
  • LearnDistribution[…,FeatureExtractor->"Minimal"] can be used to remove most preprocessing and directly access the method (see the sketch after this list).
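
A minimal sketch of how these suboptions might be specified and then inspected; the dataset and the particular option values below are assumed for illustration:

  data = RandomVariate[NormalDistribution[], 100];   (* assumed sample data *)
  ld = LearnDistribution[data,
     Method -> {"GaussianMixture", "ComponentsNumber" -> 2, "CovarianceType" -> "Diagonal"},
     FeatureExtractor -> "Minimal"];
  Information[ld, "MethodOption"]   (* options chosen by the automation system *)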

Examples


Basic Examples  (3)

Train a Gaussian mixture distribution on a numeric dataset:

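A sketch of such a call; the one-dimensional sample below is assumed (the original dataset is not shown):

  data = RandomVariate[
     MixtureDistribution[{0.6, 0.4}, {NormalDistribution[0, 1], NormalDistribution[5, 1]}], 200];
  ld = LearnDistribution[data, Method -> "GaussianMixture"]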

Look at the distribution Information:

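For instance, assuming ld is the distribution learned above:

  Information[ld]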

Obtain options information:

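A possible call, using the "MethodOption" property mentioned in the Details:

  Information[ld, "MethodOption"]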

Obtain an option value directly:

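A sketch, assuming the chosen option can be queried by name as an Information property:

  Information[ld, "ComponentsNumber"]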

Compute the probability density for a new example:

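For example, at an assumed query point:

  PDF[ld, 2.5]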

Plot the PDF along with the training data:

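One possible way to overlay the learned PDF and the assumed training data (plot range assumed):

  Show[
   Histogram[data, Automatic, "PDF"],
   Plot[PDF[ld, x], {x, -4, 9}, PlotRange -> All],
   PlotRange -> All
  ]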

Generate and visualize new samples:

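A sketch using RandomVariate on the learned distribution; the sample size is assumed:

  samples = RandomVariate[ld, 200];
  Histogram[samples]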

Train a Gaussian mixture distribution on a two-dimensional dataset:

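A sketch with an assumed two-dimensional sample (not the original dataset):

  data2d = RandomVariate[BinormalDistribution[{0, 0}, {1, 1}, 0.6], 300];
  ld2d = LearnDistribution[data2d, Method -> "GaussianMixture"]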

Plot the PDF along with the training data:

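One possible visualization, with assumed axis ranges:

  Show[
   ContourPlot[PDF[ld2d, {x, y}], {x, -3, 3}, {y, -3, 3}],
   ListPlot[data2d, PlotStyle -> Directive[Red, PointSize[Small]]]
  ]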

Use SynthesizeMissingValues to impute missing values using the learned distribution:

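A sketch, with assumed incomplete examples to impute from the learned distribution:

  incomplete = {{1.2, Missing[]}, {Missing[], -0.7}};
  SynthesizeMissingValues[incomplete, ld2d]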

Train a Gaussian mixture distribution on a nominal dataset:

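A sketch with an assumed nominal sample:

  nominal = RandomChoice[{"A", "A", "B", "C"}, 100];
  ldNominal = LearnDistribution[nominal, Method -> "GaussianMixture"]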

Because of the necessary preprocessing, the PDF computation is not exact:

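For instance, repeated evaluations of the same query may return slightly different estimates:

  PDF[ldNominal, "A"]
  PDF[ldNominal, "A"]   (* the result is a stochastic estimate, so it may differ between calls *)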

Use ComputeUncertainty to obtain the uncertainty on the result:

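A sketch of such a call:

  PDF[ldNominal, "A", ComputeUncertainty -> True]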

Increase MaxIterations to improve the estimation precision:

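A sketch, assuming MaxIterations is passed to the PDF estimation here rather than to the training step:

  PDF[ldNominal, "A", ComputeUncertainty -> True, MaxIterations -> 10000]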

Options  (3)