"NSFWImage" (Built-in Classifier)

Determine whether an image is considered not safe for work.

Classes

Details

Examples


Basic Examples (2)

Determine whether the input image contains pornographic content (not safe):
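A minimal sketch, where img stands in for the test image used on the original page and the output label is illustrative:

In[1]:= Classify["NSFWImage", img]
Out[1]= "Unsafe"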

Obtain the probabilities for different classes:
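Assuming the same image, requesting the "Probabilities" property returns an association of class probabilities; the values shown are illustrative:

In[2]:= Classify["NSFWImage", img, "Probabilities"]
Out[2]= <|"Safe" -> 0.0342, "Unsafe" -> 0.9658|>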

Obtain a ClassifierFunction for this classifier:
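Calling Classify with only the classifier name yields the underlying ClassifierFunction; the abbreviated output below stands in for the boxed summary shown in the notebook:

In[1]:= cf = Classify["NSFWImage"]
Out[1]= ClassifierFunction[...]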

Use the classifier function to classify an example:
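A sketch assuming img as before, with an illustrative output:

In[2]:= cf[img]
Out[2]= "Unsafe"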

Classify many examples at once:
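A ClassifierFunction applied to a list classifies each element; img1, img2, img3 are hypothetical images and the labels shown are illustrative:

In[3]:= cf[{img1, img2, img3}]
Out[3]= {"Unsafe", "Safe", "Safe"}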

Scope (1)

Load the ClassifierFunction corresponding to the built-in classifier:
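The same call as in the basic examples, sketched with an abbreviated output form:

In[1]:= cf = Classify["NSFWImage"]
Out[1]= ClassifierFunction[...]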

Obtain the possible classes:
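A sketch using Information with the "Classes" property (ClassifierInformation in versions before 12.0); the class names "Safe" and "Unsafe" are assumed, consistent with the examples above:

In[2]:= Information[cf, "Classes"]
Out[2]= {"Safe", "Unsafe"}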

Options (2)

ClassPriors (1)

Use a custom ClassPriors for the possible outputs:
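A sketch assuming img and illustrative priors that strongly favor "Safe"; with such priors, a borderline image may be classified as "Safe" instead of "Unsafe":

In[1]:= Classify["NSFWImage", img, ClassPriors -> <|"Safe" -> 0.9, "Unsafe" -> 0.1|>]
Out[1]= "Safe"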

IndeterminateThreshold (1)

Use a custom IndeterminateThreshold:
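A sketch assuming img; with a threshold of 0.9, any prediction whose top class probability falls below 0.9 is returned as Indeterminate (the output shown is illustrative):

In[1]:= Classify["NSFWImage", img, IndeterminateThreshold -> 0.9]
Out[1]= Indeterminate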