DropoutLayer

DropoutLayer[]

represents a net layer that sets its input elements to zero with probability 0.5 during training, multiplying the remainder by 2.

DropoutLayer[p]

sets its input elements to zero with probability p during training, multiplying the remainder by 1/(1-p).
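For example, with p=0.3 the surviving elements are scaled by 1/(1-0.3)≈1.43 during training, so the expected value of each element is preserved. A minimal sketch of this (the value 0.3 is an arbitrary choice, and the zeroed positions are random, so any particular output will differ):

In[1]:= DropoutLayer[0.3][{1., 1., 1., 1.}, NetEvaluationMode -> "Train"]
Out[1]= {1.42857, 0., 1.42857, 1.42857}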

Details and Options

  • DropoutLayer[][input] explicitly computes the output from applying the layer.
  • DropoutLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
  • DropoutLayer is typically used inside NetChain, NetGraph, etc.
  • DropoutLayer exposes the following ports for use in NetGraph etc.:
  • "Input"a numerical tensor or sequence of tensors of arbitrary rank
    "Output"a numerical tensor or sequence of tensors of arbitrary rank
  • DropoutLayer normally infers the dimensions of its input from its context in NetChain etc. To specify the dimensions explicitly as {n1,n2,…}, use DropoutLayer["Input"->{n1,n2,…}] (see the sketch after this list).
  • DropoutLayer only randomly sets input elements to zero during training. During evaluation, DropoutLayer leaves the input unchanged, unless NetEvaluationMode->"Train" is specified when applying the layer.
  • DropoutLayer is commonly used as a form of neural network regularization.
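A sketch of the shape specification and typical usage described above (the input size 10 and the layer sizes 20 and 2 are arbitrary choices for illustration, and the notebook summary panels are abbreviated as …):

In[1]:= DropoutLayer["Input" -> {10}]
Out[1]= DropoutLayer[…]

In[2]:= NetChain[{LinearLayer[20], Ramp, DropoutLayer[0.5], LinearLayer[2]}, "Input" -> 10]
Out[2]= NetChain[…]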

Examples


Basic Examples  (2)

Create a DropoutLayer:

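A plausible reconstruction of this example (in a notebook the result displays as a summary panel, abbreviated here as …):

In[1]:= DropoutLayer[]
Out[1]= DropoutLayer[…]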

Create a DropoutLayer and apply it to an input, which remains unchanged:

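A plausible reconstruction (the summary panel is abbreviated as …; outside of training, the layer acts as the identity):

In[1]:= layer = DropoutLayer[]
Out[1]= DropoutLayer[…]

In[2]:= layer[{1., 2., 3.}]
Out[2]= {1., 2., 3.}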

Use NetEvaluationMode to force training behavior of DropoutLayer:

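A plausible reconstruction (the output is random: each element is independently zeroed with probability 0.5 or multiplied by 2, so any particular run will differ):

In[3]:= layer[{1., 2., 3.}, NetEvaluationMode -> "Train"]
Out[3]= {2., 0., 6.}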

Scope  (2)

Properties & Relations  (1)

Possible Issues  (1)

See Also

BatchNormalizationLayer ▪ NetEvaluationMode ▪ NetChain ▪ NetGraph ▪ NetTrain

Introduced in 2016 (11.0) | Updated in 2017 (11.1)