Classify[{in1 -> class1, in2 -> class2, …}]
generates a ClassifierFunction[…] based on the examples and classes given.

Classify[{in1, in2, …} -> {class1, class2, …}]
also generates a ClassifierFunction[…] based on the examples and classes given.

Classify[<|class1 -> {in11, in12, …}, class2 -> {in21, …}, …|>]
generates a ClassifierFunction[…] based on an association of classes with their examples.

Classify[training, data]
attempts to classify data using a classifier function deduced from the training set given.

Classify["name", data]
attempts to classify data using the built-in classifier function represented by "name".

Classify[…, data, prop]
gives the specified property of the classification associated with data.

Classify[ClassifierFunction[…], opts]
takes an existing classifier function and modifies it with the new options given.

Details and Options

  • Classify can be used on many types of data, including numerical, textual, sound, and image data, as well as combinations of these.
  • Each examplei can be a single data element, a list of data elements, an association of data elements, or a Dataset object. In Classify[training, …], training can be a Dataset object.
  • Classify[training] returns a ClassifierFunction[…] that can then be applied to specific data.
  • In Classify[…, data], data can be a single item or a list of items.
  • In Classify[…, data, prop], properties are as given in ClassifierFunction[…]; they include:
  • "Decision"    best class according to probabilities and utility function
    "TopProbabilities"    probabilities for most likely classes
    "TopProbabilities" -> n    probabilities for the n most likely classes
    "Probability" -> class    probability for a specific class
    "Probabilities"    association of probabilities for all possible classes
    "Properties"    list of all properties available
  • Examples of built-in classifier functions include:
  • "CountryFlag"    which country a flag image is for
    "FacebookTopic"    which topic a Facebook post is about
    "FacialAge"    estimated age from a face
    "FacialExpression"    what type of expression a face displays
    "FacialGender"    what gender a face appears to be
    "Language"    which natural language text is in
    "NameGender"    which gender a first name is
    "NotablePerson"    what notable person an image is of
    "NSFWImage"    whether an image is considered "not safe for work"
    "Profanity"    whether text contains profanity
    "ProgrammingLanguage"    which programming language text is in
    "Sentiment"    sentiment of a social media post
    "Spam"    whether email is spam
  • The following options can be given:
  • ClassPriors    Automatic    explicit prior probabilities for classes
    FeatureExtractor    Identity    how to extract features from which to learn
    FeatureNames    Automatic    feature names to assign for input data
    FeatureTypes    Automatic    feature types to assume for input data
    IndeterminateThreshold    0    below what probability to return Indeterminate
    Method    Automatic    which classification algorithm to use
    PerformanceGoal    Automatic    aspects of performance to try to optimize
    RandomSeeding    1234    what seeding of pseudorandom generators should be done internally
    TargetDevice    "CPU"    the target device on which to perform training
    TimeGoal    Automatic    how long to spend training the classifier
    TrainingProgressReporting    Automatic    how to report progress during training
    UtilityFunction    Automatic    utility as function of actual and predicted class
    ValidationSet    Automatic    the set of data on which to evaluate the model during training
  • Possible settings for PerformanceGoal include:
  • "DirectTraining"    train directly on the full dataset, without model searching
    "Memory"    minimize storage requirements of the classifier
    "Quality"    maximize accuracy of the classifier
    "Speed"    maximize speed of the classifier
    "TrainingSpeed"    minimize time spent producing the classifier
    Automatic    automatic tradeoff among speed, accuracy, and memory
    {goal1, goal2, …}    automatically combine goal1, goal2, etc.
  • Possible settings for Method include:
  • "DecisionTree"    classify using a decision tree
    "GradientBoostedTrees"    classify using an ensemble of trees trained with gradient boosting
    "LogisticRegression"    classify using probabilities from linear combinations of features
    "Markov"    classify using a Markov model on the sequence of features (only for text, bags of tokens, etc.)
    "NaiveBayes"    classify by assuming probabilistic independence of features
    "NearestNeighbors"    classify from nearest neighbor examples
    "NeuralNetwork"    classify using an artificial neural network
    "RandomForest"    classify using Breiman–Cutler ensembles of decision trees
    "SupportVectorMachine"    classify using a support vector machine
  • The following settings for TrainingProgressReporting can be used:
  • "Panel"    show a dynamically updating graphical panel
    "Print"    periodically report information using Print
    "ProgressIndicator"    show a simple ProgressIndicator
    "SimplePanel"    dynamically updating panel without learning curves
    None    do not report any information
  • Possible settings for RandomSeeding include:
  • Automatic    automatically reseed every time the function is called
    Inherited    use externally seeded random numbers
    seed    use an explicit integer or string as a seed
  • Classify[{assoc1, assoc2, …} -> "key", …] can be used to specify that the class is given by the value of "key" in each association associ.
  • Classify[{list1, list2, …} -> n, …] can be used to specify that the class is given by the value of part n in each list listi.
  • Classify[Dataset[…] -> part, …] can be used to specify that classes are given by the value of part of each row of the dataset.
  • Classify[net] can be used to convert a NetChain or NetGraph representing a classifier into a ClassifierFunction[…].
  • Classify[…, FeatureExtractor -> "Minimal"] indicates that the internal preprocessing should be as simple as possible.
  • In Classify[ClassifierFunction[…], FeatureExtractor -> fe], the FeatureExtractorFunction[…] fe will be prepended to the existing feature extractor.
  • Information can be used on the ClassifierFunction[…] obtained.
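As an illustrative sketch of these usage forms (the training data below is made up for demonstration, and the built-in "Language" classifier may need to download resources on first use):

```wolfram
(* class given by the value of a key in each association *)
c = Classify[{<|"f" -> 1, "label" -> "A"|>, <|"f" -> 2, "label" -> "A"|>,
     <|"f" -> 8, "label" -> "B"|>, <|"f" -> 9, "label" -> "B"|>} -> "label"];

(* inspect the trained classifier *)
Information[c]

(* a built-in classifier, called by name *)
Classify["Language", "Bonjour tout le monde"]
```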



Basic Examples  (2)

Train a classifier function on labeled examples:

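For instance, training on a small set of labeled numbers (the data here is illustrative):

```wolfram
c = Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}]
```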

Use the classifier function to classify a new unlabeled example:

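A sketch, using toy training data:

```wolfram
c = Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}];
c[2.]
```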

Obtain classification probabilities for this example:

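For example, with the same toy data:

```wolfram
c = Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}];
c[2., "Probabilities"]  (* an association of class -> probability *)
```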

Classify multiple examples:

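Passing a list of items classifies each of them; an illustrative sketch:

```wolfram
c = Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}];
c[{1.3, 2.9, 3.9}]
```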

Plot the probability that the class of an example is "A" as a function of the feature:

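One way to visualize this, assuming a toy one-feature classifier:

```wolfram
c = Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}];
Plot[c[x, "Probability" -> "A"], {x, 0, 4}]
```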

The training and the classification can be performed in one step:

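For example, passing the data to classify directly after the training set:

```wolfram
Classify[{1.1 -> "A", 2.7 -> "B", 3.5 -> "B", 1.2 -> "A"}, 2.]
```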

Train a classifier with multiple features:

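Each example can be a list of features, mixing numerical and nominal values (illustrative data):

```wolfram
c2 = Classify[{{1.0, "x"} -> "A", {1.2, "x"} -> "A",
     {3.1, "y"} -> "B", {3.3, "y"} -> "B"}]
```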

Classify a new example:

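For instance, with toy two-feature training data:

```wolfram
c2 = Classify[{{1.0, "x"} -> "A", {1.2, "x"} -> "A", {3.1, "y"} -> "B", {3.3, "y"} -> "B"}];
c2[{1.1, "x"}]
```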

Classify an example that has missing features:

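Missing values can be indicated with Missing[]; a sketch on toy data:

```wolfram
c2 = Classify[{{1.0, "x"} -> "A", {1.2, "x"} -> "A", {3.1, "y"} -> "B", {3.3, "y"} -> "B"}];
c2[{Missing[], "y"}]  (* Missing[] marks the absent feature *)
```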

Get the probabilities for the most probable classes:

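Using the "TopProbabilities" property, for example:

```wolfram
c2 = Classify[{{1.0, "x"} -> "A", {1.2, "x"} -> "A", {3.1, "y"} -> "B", {3.3, "y"} -> "B"}];
c2[{1.1, "x"}, "TopProbabilities"]  (* a list of class -> probability rules *)
```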

Scope  (16)

Options  (19)

Applications  (7)

Possible Issues  (1)

Neat Examples  (2)

Introduced in 2014
Updated in 2018