InstanceNormalizationLayer

InstanceNormalizationLayer[]

represents a trainable net layer that normalizes its input data across the spatial dimensions.

Details and Options

  • The following optional parameters can be included:
    "Beta"      Automatic   learnable bias parameters
    "Epsilon"   0.001       stability parameter
    "Gamma"     Automatic   learnable scaling parameters
  • With Automatic settings, gamma and beta are added automatically when NetInitialize or NetTrain is used.
  • If beta and gamma have been added, InstanceNormalizationLayer[][input] explicitly computes the output from applying the layer.
  • InstanceNormalizationLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
  • NetExtract can be used to extract the beta, epsilon, and gamma parameters from an InstanceNormalizationLayer object.
  • InstanceNormalizationLayer is typically used inside NetChain, NetGraph, etc.
  • InstanceNormalizationLayer exposes the following ports for use in NetGraph etc.:
    "Input"    a numerical tensor of rank greater than 2
    "Output"   a numerical tensor of rank greater than 2
  • When it cannot be inferred from other layers in a larger net, the option "Input"->{n1,n2,…} can be used to fix the input dimensions of InstanceNormalizationLayer.
  • Consider an input tensor with elements $x_{c,i}$, where $c$ indexes the channel dimension and $i$ indexes all the spatial dimensions flattened together. Then the output is obtained via $y_{c,i} = \gamma_c \frac{x_{c,i} - \mu_c}{\sqrt{\sigma_c^2 + \epsilon}} + \beta_c$, where the mean is given by $\mu_c = \frac{1}{m}\sum_{i=1}^{m} x_{c,i}$, the variance by $\sigma_c^2 = \frac{1}{m}\sum_{i=1}^{m} (x_{c,i} - \mu_c)^2$, $\beta_c$ is the "Beta" parameter, $\gamma_c$ is the "Gamma" parameter, $\epsilon$ is the "Epsilon" parameter, and $m$ is the total size of the spatial dimensions.
  • The above definition of InstanceNormalizationLayer is based on the definition found in Ulyanov et al., Instance Normalization: The Missing Ingredient for Fast Stylization, 2016.
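The per-channel computation described above can be sketched outside the Wolfram Language as well. The following is a minimal NumPy illustration of the same formula (the function name `instance_norm` and the test shapes are chosen here for illustration, not taken from the Wolfram neural network framework):

```python
import numpy as np

def instance_norm(x, gamma, beta, epsilon=0.001):
    """Instance normalization of a single example x with shape
    (channels, *spatial), following the formula above: each channel is
    normalized by its own mean and variance over the spatial dimensions."""
    c = x.shape[0]
    flat = x.reshape(c, -1)                    # flatten spatial dims: shape (c, m)
    mu = flat.mean(axis=1, keepdims=True)      # per-channel mean
    var = flat.var(axis=1, keepdims=True)      # per-channel (biased) variance
    normed = (flat - mu) / np.sqrt(var + epsilon)
    out = gamma[:, None] * normed + beta[:, None]
    return out.reshape(x.shape)

# Rank-3 input: 2 channels, each with a 2x2 spatial grid
x = np.arange(8, dtype=float).reshape(2, 2, 2)
y = instance_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

With the default-style gamma of ones and beta of zeros, each channel of the output has zero mean, and variance slightly below 1 because of the $\epsilon$ term in the denominator.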

Examples


Basic Examples  (2)

Create an InstanceNormalizationLayer:

In[1]:= InstanceNormalizationLayer[]
Out[1]=

Create an initialized InstanceNormalizationLayer that takes a rank-3 tensor:

In[1]:= net = NetInitialize[InstanceNormalizationLayer["Input" -> {2, 2, 2}]]
Out[1]=

Apply the layer to an input:

In[2]:= net[{{{1., 2.}, {3., 4.}}, {{5., 6.}, {7., 8.}}}]
Out[2]=

Scope  (2)

Options  (3)

Possible Issues  (1)

See Also

BatchNormalizationLayer  ConvolutionLayer  LocalResponseNormalizationLayer  NetChain  NetGraph  NetInitialize  NetTrain  NetExtract

Introduced in 2017
(11.1)