DropoutLayer

DropoutLayer[]

represents a net layer that sets its input elements to zero with probability 0.5 during training.

DropoutLayer[p]

sets its input elements to zero with probability p during training.

Details and Options

  • DropoutLayer is commonly used as a form of neural network regularization.
  • DropoutLayer is typically used inside NetChain, NetGraph, etc.
  • The following optional parameters can be included:
    Method          "Dropout"    dropout method to use
    "OutputPorts"   "Output"     output ports
  • Possible explicit settings for the Method option include:
    "AlphaDropout"   keeps the mean and variance of the input constant; designed to be used together with the ElementwiseLayer["SELU"] activation
    "Dropout"        sets the input elements to zero with probability p during training, multiplying the remainder by 1/(1-p)
  • Possible settings for the "OutputPorts" option include:
    "BinaryMask"   binary mask applied to input data
    "Output"       output of the dropout
    {port1,…}      a list of valid ports
  • DropoutLayer exposes the following ports for use in NetGraph etc.:
    "Input"    an array or sequence of arrays of arbitrary rank
    "Output"   an array or sequence of arrays of arbitrary rank
  • DropoutLayer normally infers the dimensions of its input from its context in NetChain etc. To specify the dimensions explicitly as {n1,n2,…}, use DropoutLayer["Input"->{n1,n2,…}].
  • DropoutLayer[][input] explicitly computes the output from applying the layer.
  • DropoutLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
  • When given a NumericArray as input, the output will be a NumericArray.
  • DropoutLayer only randomly sets input elements to zero during training. During evaluation, DropoutLayer leaves the input unchanged, unless NetEvaluationMode->"Train" is specified when applying the layer.
  • Options[DropoutLayer] gives the list of default options to construct the layer. Options[DropoutLayer[]] gives the list of default options to evaluate the layer on some data.
  • Information[DropoutLayer[]] gives a report about the layer.
  • Information[DropoutLayer[],prop] gives the value of the property prop of DropoutLayer[]. Possible properties are the same as for NetGraph.

Examples


Basic Examples  (1)

Create a DropoutLayer:

Apply it to an input, which remains unchanged:

Use NetEvaluationMode to force training behavior of DropoutLayer:
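The three steps above might look as follows (the input vector is illustrative; in training mode a random subset of elements is zeroed and the rest are scaled by 1/(1-p)):

```wl
layer = DropoutLayer[]

layer[{1., 2., 3., 4.}]  (* evaluation mode: returns {1., 2., 3., 4.} unchanged *)

layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train"]  (* e.g. {2., 0., 6., 8.} with p = 0.5 *)
```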

Scope  (2)

Create a DropoutLayer with a specific probability:

Apply it to input data, which leaves the input unchanged:

Apply it to input data, specifying that training behavior be used:
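A sketch of these steps, using an illustrative dropout probability of 0.8:

```wl
layer = DropoutLayer[0.8]

layer[{1., 2., 3.}]   (* unchanged in evaluation mode *)

layer[{1., 2., 3.}, NetEvaluationMode -> "Train"]   (* surviving elements are scaled by 1/(1 - 0.8) = 5 *)
```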

Create a DropoutLayer that takes an RGB image and returns an RGB image:

The DropoutLayer acts on the image represented by a rank-3 array by randomly and independently zeroing the individual color components of each pixel:
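One way such a layer could be set up; the 32x32 image size and the encoder and decoder settings are assumptions, not taken from the original cells:

```wl
layer = DropoutLayer[0.5,
  "Input" -> NetEncoder[{"Image", {32, 32}, ColorSpace -> "RGB"}],
  "Output" -> NetDecoder["Image"]]

(* img stands for any image; zeroed color components show up as darkened or color-shifted pixels *)
layer[img, NetEvaluationMode -> "Train"]
```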

Options  (2)

Method  (1)

Create a DropoutLayer using "AlphaDropout" as the dropout method:

Apply it to input data, specifying that training behavior be used:

If the input data has a mean of 0 and a variance of 1, then the output will have the same mean and variance:

This is not the case for the standard dropout method:
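These cells might be reconstructed along the following lines (the sample size is illustrative):

```wl
layer = DropoutLayer[Method -> "AlphaDropout"]

data = RandomVariate[NormalDistribution[0, 1], 10^5];
out = layer[data, NetEvaluationMode -> "Train"];
{Mean[out], Variance[out]}   (* both remain close to the input's 0 and 1 *)

(* the standard method preserves the mean but inflates the variance (roughly 2 for p = 0.5) *)
out2 = DropoutLayer[][data, NetEvaluationMode -> "Train"];
{Mean[out2], Variance[out2]}
```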

"OutputPorts"  (1)

Create a DropoutLayer that yields the binary mask in addition to the output:

Apply it to input data:

Apply it to input data, specifying that training behavior be used:
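A possible form of these cells (input values are illustrative):

```wl
layer = DropoutLayer["OutputPorts" -> {"Output", "BinaryMask"}]

layer[{1., 2., 3., 4.}]
(* evaluation mode: the mask is all ones and the output equals the input *)

layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train"]
(* an association, e.g. <|"Output" -> {2., 0., 6., 8.}, "BinaryMask" -> {1., 0., 1., 1.}|> *)
```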

Properties & Relations  (1)

DropoutLayer can be used between recurrent layers in a NetChain to perform regularization. A typical network used to classify sentences might incorporate a DropoutLayer as follows:

More sophisticated forms of dropout are possible by using the "Dropout" option of recurrent layers directly:
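An illustrative sentence-classification chain with dropout between the recurrent and final layers; the layer sizes and the token encoder are assumptions:

```wl
net = NetChain[{
   EmbeddingLayer[50],
   LongShortTermMemoryLayer[100],
   DropoutLayer[0.5],
   SequenceLastLayer[],
   LinearLayer[2],
   SoftmaxLayer[]},
  "Input" -> NetEncoder[{"Tokens"}]]

(* variational dropout applied inside the recurrent layer itself *)
LongShortTermMemoryLayer[100, "Dropout" -> {"VariationalInput" -> 0.5}]
```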

Possible Issues  (1)

By default, any randomness invoked by NetEvaluationMode->"Train" is not affected by SeedRandom and BlockRandom:

Use the option RandomSeeding->Inherited to change this behavior:

Use the option RandomSeeding to control the randomness:
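These cells might look like the following (input values are illustrative):

```wl
layer = DropoutLayer[];

(* same seed, but results differ between evaluations *)
BlockRandom[SeedRandom[1234];
 layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train"]]

(* RandomSeeding -> Inherited makes the layer respect SeedRandom and BlockRandom *)
BlockRandom[SeedRandom[1234];
 layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train", RandomSeeding -> Inherited]]

(* a fixed seed gives reproducible results directly *)
layer[{1., 2., 3., 4.}, NetEvaluationMode -> "Train", RandomSeeding -> 1234]
```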

Text

Wolfram Research (2016), DropoutLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/DropoutLayer.html (updated 2020).

CMS

Wolfram Language. 2016. "DropoutLayer." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2020. https://reference.wolfram.com/language/ref/DropoutLayer.html.

APA

Wolfram Language. (2016). DropoutLayer. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/DropoutLayer.html

BibTeX

@misc{reference.wolfram_2023_dropoutlayer, author="Wolfram Research", title="{DropoutLayer}", year="2020", howpublished="\url{https://reference.wolfram.com/language/ref/DropoutLayer.html}", note={Accessed: 19-March-2024}}

BibLaTeX

@online{reference.wolfram_2023_dropoutlayer, organization={Wolfram Research}, title={DropoutLayer}, year={2020}, url={https://reference.wolfram.com/language/ref/DropoutLayer.html}, note={Accessed: 19-March-2024}}