LocalResponseNormalizationLayer

LocalResponseNormalizationLayer[]

represents a net layer that normalizes its input by averaging across neighboring input channels.

Details and Options

  • The following optional parameters can be included:
  • "Alpha"1.0scaling parameter
    "Beta"0.5power parameter
    "Bias"1.0bias parameter
    "ChannelWindowSize"2number of channels to average over
  • LocalResponseNormalizationLayer[][input] explicitly computes the output from applying the layer.
  • LocalResponseNormalizationLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
  • When given a NumericArray as input, the output will be a NumericArray.
  • LocalResponseNormalizationLayer is typically used inside NetChain, NetGraph, etc.
  • LocalResponseNormalizationLayer exposes the following ports for use in NetGraph etc.:
  • "Input"a rank-3 array
    "Output"a rank-3 array
  • When it cannot be inferred from other layers in a larger net, the option "Input"->{n1,n2,n3} can be used to fix the input dimensions of LocalResponseNormalizationLayer.
  • The output array O is obtained via O_(ijk) = T_(ijk)/D_(ijk), where D_(ijk) = ((alpha)/(2c+1) sum_(x=max(1,i-c))^(min(N,i+c)) T_(xjk)^2 + b)^(beta), T_(ijk) is the input, alpha is the "Alpha" parameter, beta is the "Beta" parameter, b is the "Bias" parameter, c is the "ChannelWindowSize" and N is the number of channels in the input T_(ijk). A numeric check of this formula is sketched after these notes.
  • Options[LocalResponseNormalizationLayer] gives the list of default options to construct the layer. Options[LocalResponseNormalizationLayer[]] gives the list of default options to evaluate the layer on some data.
  • Information[LocalResponseNormalizationLayer[]] gives a report about the layer.
  • Information[LocalResponseNormalizationLayer[],prop] gives the value of the property prop of LocalResponseNormalizationLayer[]. Possible properties are the same as for NetGraph.
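
A minimal sketch (not part of the original reference) that checks the formula above against the layer itself; the input dimensions and the explicit use of the default parameter values are assumptions made for illustration:

  input = RandomReal[1, {7, 4, 4}];   (* 7 channels, 4x4 spatial dimensions *)
  layer = LocalResponseNormalizationLayer["Input" -> {7, 4, 4}];
  alpha = 1.0; beta = 0.5; b = 1.0; c = 2;   (* the default parameter values *)
  n = Length[input];
  d = Table[
     (alpha/(2 c + 1) Sum[input[[x, j, k]]^2, {x, Max[1, i - c], Min[n, i + c]}] + b)^beta,
     {i, n}, {j, 4}, {k, 4}];
  Max@Abs[layer[input] - input/d]   (* should be close to zero *)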

Examples

Basic Examples  (2)

Create a LocalResponseNormalizationLayer:
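
For example (a minimal sketch; the original example cells are not reproduced here):

  LocalResponseNormalizationLayer[]   (* a layer with the default "Alpha", "Beta", "Bias" and "ChannelWindowSize" *)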

Create a LocalResponseNormalizationLayer with input dimensions specified:
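
A sketch with illustrative input dimensions (the dimensions are an assumption, not taken from the original page):

  layer = LocalResponseNormalizationLayer["Input" -> {8, 4, 4}]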

Apply the layer to an input:
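
Applying the layer defined above to a random array of matching shape (illustrative):

  layer[RandomReal[1, {8, 4, 4}]]   (* returns a rank-3 array with the same dimensions as the input *)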

Scope  (1)

Ports  (1)

Create a LocalResponseNormalizationLayer with input dimensions specified:
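
A sketch with assumed input dimensions:

  layer = LocalResponseNormalizationLayer["Input" -> {10, 3, 3}]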

Apply the layer to an input:
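
Applying it to a random input of the declared shape (illustrative):

  layer[RandomReal[1, {10, 3, 3}]]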

Thread the layer across a batch of inputs:
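
The layer maps over a list of inputs, as in this sketch (batch size and dimensions are assumptions):

  batch = Table[RandomReal[1, {10, 3, 3}], 5];   (* a batch of 5 inputs *)
  layer[batch]   (* one output per input *)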

Possible Issues  (1)

The number of input channels must be larger than 2×(channel window size) + 1:
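
A sketch of the failure case (the exact message is not reproduced here): with the default "ChannelWindowSize" of 2, an input with only 4 channels violates the requirement 4 > 2*2 + 1 and is expected to produce a failure:

  LocalResponseNormalizationLayer["Input" -> {4, 8, 8}]   (* expected to fail: 4 channels <= 2*2 + 1 *)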

Text

Wolfram Research (2017), LocalResponseNormalizationLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/LocalResponseNormalizationLayer.html.

CMS

Wolfram Language. 2017. "LocalResponseNormalizationLayer." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/LocalResponseNormalizationLayer.html.

APA

Wolfram Language. (2017). LocalResponseNormalizationLayer. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/LocalResponseNormalizationLayer.html

BibTeX

@misc{reference.wolfram_2024_localresponsenormalizationlayer, author="Wolfram Research", title="{LocalResponseNormalizationLayer}", year="2017", howpublished="\url{https://reference.wolfram.com/language/ref/LocalResponseNormalizationLayer.html}", note={Accessed: 21-November-2024}}

BibLaTeX

@online{reference.wolfram_2024_localresponsenormalizationlayer, organization={Wolfram Research}, title={LocalResponseNormalizationLayer}, year={2017}, url={https://reference.wolfram.com/language/ref/LocalResponseNormalizationLayer.html}, note={Accessed: 21-November-2024}}