"NSFWImage" (Built-in Classifier)

Determine whether an image is considered not safe for work.

Classes


Details

Examples


Basic Examples  (2)

Determine whether the input image contains pornographic content (not safe):

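A minimal sketch of the call, assuming any Image expression as input; the ExampleData test image below is only a placeholder, not the image used in the original example:

img = ExampleData[{"TestImage", "House"}]; (* placeholder input image *)
Classify["NSFWImage", img] (* returns the predicted class for the image *)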

Obtain the probabilities for different classes:

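A sketch using the three-argument form of Classify with the "Probabilities" property; img is the placeholder image defined above:

Classify["NSFWImage", img, "Probabilities"] (* association of class -> probability *)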

Obtain a ClassifierFunction for this classifier:

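A sketch, assuming the classifier function is stored in a variable named cf (the name is arbitrary):

cf = Classify["NSFWImage"] (* returns a ClassifierFunction[...] object *)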

Use the classifier function to classify an example:

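The classifier function can be applied directly to an image; cf and img are the names introduced in the sketches above:

cf[img] (* same result as Classify["NSFWImage", img] *)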

Classify many examples at once:

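Applying the classifier function to a list of images classifies each element; the ExampleData images are placeholders:

imgs = {ExampleData[{"TestImage", "House"}], ExampleData[{"TestImage", "Mandrill"}]}; (* placeholder images *)
cf[imgs] (* returns a list with one class per image *)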

Scope  (1)

Options  (2)

See Also

Classify  ▪  NetModel

Related Models