ElementwiseLayer[f]
represents a net layer that applies a unary function f to every element of the input array.
ElementwiseLayer["name"]
applies the function specified by "name".
Details and Options


- The function f can be any one of the following: Ramp, LogisticSigmoid, Tanh, ArcTan, ArcTanh, Sin, Sinh, ArcSin, ArcSinh, Cos, Cosh, ArcCos, ArcCosh, Log, Exp, Sqrt, Abs, Gamma, LogGamma.
- In general, f can be any object that when applied to a single argument gives any combination of Ramp, LogisticSigmoid, etc., together with Plus, Subtract, Times, Divide, Power, Min, Max, Clip, and numbers.
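For example, a "leaky ReLU" is not built in under a name, but can be assembled from the allowed primitives Ramp, Min, Times, and Plus (a sketch; the 0.1 slope is an arbitrary choice):

```wl
(* leaky ReLU: x for x>0, 0.1 x for x<0 *)
leaky = ElementwiseLayer[Ramp[#] + 0.1 Min[#, 0] &]
```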
- ElementwiseLayer supports the following values for "name":

    "RectifiedLinearUnit" or "ReLU"	Ramp[x]
    "ExponentialLinearUnit" or "ELU"	x when x≥0; Exp[x]-1 when x<0
    "ScaledExponentialLinearUnit" or "SELU"	1.0507 x when x≥0; 1.7581 (Exp[x]-1) when x<0
    "SoftSign"	x/(1+Abs[x])
    "SoftPlus"	Log[Exp[x]+1]
    "HardTanh"	Clip[x,{-1,1}]
    "HardSigmoid"	Clip[(x+1)/2,{0,1}]
    "Sigmoid"	LogisticSigmoid[x]

- ElementwiseLayer[…][input] explicitly computes the output from applying the layer.
- ElementwiseLayer[…][{input1,input2,…}] explicitly computes outputs for each of the inputi.
- When given a NumericArray as input, the output will be a NumericArray.
- ElementwiseLayer is typically used inside NetChain, NetGraph, etc.
- ElementwiseLayer exposes the following ports for use in NetGraph etc.:

    "Input"	an array of arbitrary rank
    "Output"	an array with the same dimensions as the input

- When it cannot be inferred from other layers in a larger net, the option "Input"->{n1,n2,…} can be used to fix the input dimensions of ElementwiseLayer.
Examples
Basic Examples (2)
Create an ElementwiseLayer that computes Tanh of each input element:
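A minimal sketch of this example (approximate values shown):

```wl
layer = ElementwiseLayer[Tanh];
layer[{1., 2., 3.}]  (* ≈ {0.7616, 0.9640, 0.9951} *)
```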
Create an ElementwiseLayer that multiplies its input by a fixed constant:
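A pure function built from Times and a number is a valid f, so multiplication by a constant (here 3, an arbitrary choice) can be written as:

```wl
layer = ElementwiseLayer[3 # &];
layer[{1., 2., 3.}]  (* {3., 6., 9.} *)
```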
Scope (3)
Applications (3)
Properties & Relations (1)
Possible Issues (3)
Introduced in 2016 (11.0) | Updated in 2017 (11.2)