"NaiveBayes" (Machine Learning Method)

Details & Suboptions

  • Naive Bayes is a classification technique based on Bayes's theorem that assumes the features are independent given the class. The class probabilities for a given example x = {x_1, x_2, …} are then P(class = c | x) ∝ P(c) ∏_i P(x_i | c), where P(x_i | c) is the probability distribution of feature i given the class, and P(c) is the prior probability of the class. Both distributions are estimated from the training data. In the current implementation, distributions are modeled using a piecewise-constant function (i.e. a variable-width histogram). A small worked sketch of this computation is given after this list.
  • The following suboption can be given:
  • "SmoothingParameter" .2regularization parameter

Examples


Basic Examples  (2)

Train a classifier function on labeled examples:
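A possible version of the missing input, training on made-up one-dimensional examples labeled "A" or "B" (the data values and the name c are illustrative):

  c = Classify[{1.1 -> "A", 1.9 -> "A", 2.3 -> "A", 3.6 -> "B", 4.2 -> "B", 4.8 -> "B"},
    Method -> "NaiveBayes"]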

Obtain information about the classifier:
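Assuming the classifier from the previous step is stored in c, a summary can be obtained with Information (older versions use ClassifierInformation):

  Information[c]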

Classify a new example:
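For example, applying c to a new value returns the predicted class; the "Probabilities" property returns the class probabilities instead:

  (* predicted class for a new value *)
  c[2.8]

  (* class probabilities for the same value *)
  c[2.8, "Probabilities"]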

Generate some normally distributed data:
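One possible way to generate such a dataset (the cluster centers, sample count, and the names class1 and class2 are illustrative):

  SeedRandom[42];  (* for reproducibility *)
  class1 = RandomVariate[NormalDistribution[0, 1], {200, 2}];
  class2 = RandomVariate[NormalDistribution[2, 1], {200, 2}];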

Visualize it:
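For instance, as a scatter plot of the two clusters:

  ListPlot[{class1, class2}, PlotStyle -> {Blue, Orange},
    PlotLegends -> {"class1", "class2"}, AspectRatio -> 1]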

Train a classifier on this dataset:
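Assuming the data from the previous steps, the classes can be given as an association of class name to examples:

  c = Classify[<|"class1" -> class1, "class2" -> class2|>, Method -> "NaiveBayes"]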

Plot the training set and the probability distribution of each class as a function of the features:
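One way to do this is to plot the predicted probability of "class1" over the feature plane and overlay the training points (with two classes, the probability of "class2" is the complement); the helper p1 is introduced here only to force numeric evaluation inside DensityPlot:

  p1[x_?NumericQ, y_?NumericQ] := c[{x, y}, "Probabilities"]["class1"];
  Show[
    DensityPlot[p1[x, y], {x, -4, 6}, {y, -4, 6},
      PlotPoints -> 40, ColorFunction -> "TemperatureMap"],
    ListPlot[{class1, class2}, PlotStyle -> {Blue, Orange}]
  ]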

Options  (2)

"SmoothingParameter"  (2)

Train a classifier using the "SmoothingParameter" suboption:
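For example, on made-up data, the suboption is passed inside the Method specification (the value 1 is arbitrary):

  c = Classify[{1.1 -> "A", 1.9 -> "A", 3.6 -> "B", 4.2 -> "B"},
    Method -> {"NaiveBayes", "SmoothingParameter" -> 1}]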

Train several classifiers on the "FisherIris" dataset by using different settings of the "SmoothingParameter" option:
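A possible version, using ExampleData for the dataset (the names trainingData and classifiers and the particular smoothing values are illustrative):

  trainingData = ExampleData[{"MachineLearning", "FisherIris"}, "TrainingData"];
  classifiers = Table[
    Classify[trainingData, Method -> {"NaiveBayes", "SmoothingParameter" -> s}],
    {s, {0.01, 0.2, 5}}]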

Evaluate these classifiers on a data point that is unlike points from the training set and compare the class probability for class "setosa":
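For instance, with a hypothetical point far outside the range of the iris measurements:

  outlier = {10., 10., 10., 10.};
  Map[#[outlier, "Probabilities"]["setosa"] &, classifiers]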