"NSFWImage" (Built-in Classifier)
Determine whether an image is considered not safe for work.
Classes

Details

- This classifier is based on the net model Yahoo Open NSFW Model V1.
- This classifier may download resources that will be stored in your local object store at $LocalBase; these can be listed using LocalObjects[] and removed using ResourceRemove.
Examples
Basic Examples (2)
Determine whether the input image contains pornographic content (not safe):
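A minimal sketch of this call, using an ExampleData test picture as a stand-in for the input image (the returned class depends on the image you supply):

  img = ExampleData[{"TestImage", "Mandrill"}];  (* any Image expression works here *)
  Classify["NSFWImage", img]  (* returns the predicted class for img *)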
Obtain the probabilities for different classes:
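The class probabilities can be requested with the "Probabilities" property (sketch, again using a test image as a placeholder):

  img = ExampleData[{"TestImage", "Mandrill"}];
  Classify["NSFWImage", img, "Probabilities"]  (* association of class -> probability *)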
Obtain a ClassifierFunction for this classifier:
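Sketch: calling Classify with only the classifier name yields the ClassifierFunction itself, which can then be applied to images directly:

  cf = Classify["NSFWImage"];  (* ClassifierFunction[...] for the built-in classifier *)
  cf[ExampleData[{"TestImage", "Mandrill"}]]  (* apply it like an ordinary function *)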
Scope (1)
Load the ClassifierFunction corresponding to the built-in classifier:
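A sketch of this step, loading the ClassifierFunction once and then applying it to several images in a single call (the test images are placeholders for your own data):

  cf = Classify["NSFWImage"];
  cf[{ExampleData[{"TestImage", "Mandrill"}], ExampleData[{"TestImage", "Peppers"}]}]  (* one class per image *)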
Options (2)
ClassPriors (1)
Use a custom ClassPriors for the possible outputs:
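A sketch assuming the classifier's two classes are True (not safe) and False (safe); if your version reports different class labels, use those as the association keys instead:

  img = ExampleData[{"TestImage", "Mandrill"}];
  Classify["NSFWImage", img, ClassPriors -> <|False -> 0.9, True -> 0.1|>]  (* bias the decision toward the safe class *)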
IndeterminateThreshold (1)
Use a custom IndeterminateThreshold:
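Sketch: with a threshold of 0.9, the classifier returns Indeterminate unless some class reaches at least that probability (test image used as a placeholder):

  img = ExampleData[{"TestImage", "Mandrill"}];
  Classify["NSFWImage", img, IndeterminateThreshold -> 0.9]  (* Indeterminate if no class probability reaches 0.9 *)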