ElementwiseLayer[f]
represents a net layer that applies a unary function f to every element of the input array.


ElementwiseLayer["name"]
applies the function specified by "name".

Details and Options



Basic Examples  (2)

Create an ElementwiseLayer that computes Tanh of each input element:
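For instance (a sketch; ElementwiseLayer has no learnable parameters, so the layer can be applied as soon as it is created):

```mathematica
ElementwiseLayer[Tanh]
```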

Create an ElementwiseLayer that multiplies its input by a fixed constant:

Apply the layer to an input vector:
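Sketches of these two steps, assuming the fixed constant is 2:

```mathematica
layer = ElementwiseLayer[2 # &];
layer[{1., 2., 3.}]  (* → {2., 4., 6.} *)
```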

Scope  (3)

Create an ElementwiseLayer that computes a "hard sigmoid":

Apply the layer to a batch of inputs:

Plot the behavior of the layer:
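These three steps could look as follows; taking the hard sigmoid to be clip(0.2 x + 0.5, 0, 1) is one common convention and is an assumption here:

```mathematica
layer = ElementwiseLayer[Clip[0.2 # + 0.5, {0, 1}] &];

(* a batch: each row is transformed independently *)
layer[{{-5., 0., 5.}, {1., -1., 3.}}]

(* Plot supplies numeric values, so the layer never sees a symbolic x *)
Plot[layer[x], {x, -5, 5}]
```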

Create an ElementwiseLayer that takes vectors of size 3:

Apply the layer to a single vector:

When applied, the layer will automatically thread over a batch of vectors:
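A sketch using Ramp as the elementwise function (the choice of function is an assumption; the point is the fixed "Input" shape):

```mathematica
layer = ElementwiseLayer[Ramp, "Input" -> 3];
layer[{-1., 0., 2.}]                   (* a single size-3 vector → {0., 0., 2.} *)
layer[{{-1., 0., 2.}, {3., -2., 1.}}]  (* a batch of two vectors *)
```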

Create an ElementwiseLayer that computes a Gaussian:

Apply the layer to a range of values and plot the result:
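A sketch, taking the Gaussian to be exp(-x^2):

```mathematica
layer = ElementwiseLayer[Exp[-#^2] &];
xs = Range[-3., 3., 0.1];
ListLinePlot[Transpose[{xs, layer[xs]}]]
```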

Applications  (3)

Train the 16-layer-deep self-normalizing network described in "Self-Normalizing Neural Networks" (G. Klambauer et al., 2017) on the UCI Letter dataset. Obtain the data:

Self-normalizing nets assume that the input data has a mean of 0 and a variance of 1. Standardize the test and training data:

Verify that the sample mean and variance of the training data are 0 and 1, respectively:
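One way to carry out both steps (a sketch; trainFeatures and testFeatures are assumed names for the numeric feature matrices, with the training statistics applied to both):

```mathematica
means = Mean[trainFeatures];            (* columnwise means *)
sds = StandardDeviation[trainFeatures]; (* columnwise standard deviations *)
standardize[m_] := Map[(# - means)/sds &, m];
trainStd = standardize[trainFeatures];
testStd = standardize[testFeatures];

(* verify: each feature now has mean ≈ 0 and variance ≈ 1 *)
{Mean[trainStd], Variance[trainStd]}
```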

Define a 16-layer, self-normalizing net with "AlphaDropout":
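A sketch of such a net; the hidden size, dropout rate, and class labels are assumptions, not necessarily the paper's exact values:

```mathematica
(* "AlphaDropout" preserves the self-normalizing property *)
block = {LinearLayer[256],
   ElementwiseLayer["ScaledExponentialLinearUnit"],
   DropoutLayer[0.02, "Method" -> "AlphaDropout"]};
net = NetChain[
  Flatten[{Table[block, 16], LinearLayer[26], SoftmaxLayer[]}],
  "Input" -> 16,
  "Output" -> NetDecoder[{"Class", CharacterRange["A", "Z"]}]]
```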

Train the net:

Obtain the accuracy:

Compare the accuracy against the "RandomForest" method in Classify:
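These steps might be sketched as follows, assuming net is the layer chain above and trainData and testData are lists of featureVector -> letter rules:

```mathematica
trained = NetTrain[net, trainData];

(* accuracy: fraction of test inputs predicted correctly *)
preds = trained[testData[[All, 1]]];
N@Mean[Boole[Thread[preds == testData[[All, 2]]]]]

(* baseline: a random forest trained on the same rules *)
cls = Classify[trainData, Method -> "RandomForest"];
N@Mean[Boole[Thread[cls[testData[[All, 1]]] == testData[[All, 2]]]]]
```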

Allow a classification network to deal with a "non-separable" problem. Create a synthetic training set consisting of points on a disk, separated into two classes by the circle r=0.5:
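An assumed construction of such a set, labeling each point by whether it falls inside the circle r = 0.5:

```mathematica
points = RandomPoint[Disk[], 1000];
labels = Map[Norm[#] < 0.5 &, points];   (* True inside the circle *)
trainingData = Thread[points -> labels];
```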

Create a layer composed of two LinearLayer layers, and a final transformation into a probability using an ElementwiseLayer:
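A sketch of such a net (the hidden size is an assumption); LinearLayer[{}] produces a scalar, which LogisticSigmoid maps to a probability:

```mathematica
net = NetChain[
  {LinearLayer[10], LinearLayer[{}], ElementwiseLayer[LogisticSigmoid]},
  "Input" -> 2, "Output" -> NetDecoder["Boolean"]]
```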

Train the network on the data:

The net was not able to separate the two classes:

Because LinearLayer is an affine layer, stacking two such layers without an intervening nonlinearity is equivalent to using a single LinearLayer. The decision boundary of a single LinearLayer is a line in the plane, and no single line can separate the two classes.

Train a similar net that has a Tanh nonlinearity between the two layers:
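A sketch of the modified net and its training, assuming the training rules are stored in trainingData:

```mathematica
net2 = NetChain[
  {LinearLayer[10], ElementwiseLayer[Tanh], LinearLayer[{}],
   ElementwiseLayer[LogisticSigmoid]},
  "Input" -> 2, "Output" -> NetDecoder["Boolean"]];
trained2 = NetTrain[net2, trainingData]
```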

The net can now separate the classes:

Binary classification tasks require that the output of a net be a probability. ElementwiseLayer[LogisticSigmoid] can be used to take an arbitrary scalar and produce a value between 0 and 1. Create a net that takes a vector of length 2 and produces a binary prediction:
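Such a net might be sketched as follows (the hidden size and activation are assumptions):

```mathematica
net = NetChain[
  {LinearLayer[20], ElementwiseLayer[Ramp], LinearLayer[{}],
   ElementwiseLayer[LogisticSigmoid]},
  "Input" -> 2, "Output" -> NetDecoder["Boolean"]]
```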

Train the net to decide if the first number in the vector is greater than the second:

Evaluate the net on several inputs:

The underlying output is a probability, which can be seen by disabling the "Boolean" decoder:
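One way to strip the decoder, assuming the trained net is named trained: replacing the "Output" port with None removes the NetDecoder, so the raw probability is returned:

```mathematica
NetReplacePart[trained, "Output" -> None][{3., 2.}]
```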

Properties & Relations  (1)

ElementwiseLayer is automatically used when an appropriate function is specified in a NetChain or NetGraph:
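For example, a bare function in a NetChain specification is converted automatically:

```mathematica
NetChain[{LinearLayer[10], Tanh}]
(* the second element becomes ElementwiseLayer[Tanh] *)
```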

Possible Issues  (3)

ElementwiseLayer cannot accept symbolic inputs:

Certain choices of f can produce failures for inputs outside their domain:
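Sketches of both failure modes (exact messages omitted):

```mathematica
ElementwiseLayer[Tanh][x]      (* symbolic input: evaluation fails *)
ElementwiseLayer[Log][{-1.}]   (* Log is not real-valued for negative inputs *)
```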

Certain functions are not supported directly:

Approximate the Zeta function using a rational function:

The approximation is good over the range (-10,10):

Construct an ElementwiseLayer using the approximation:

Measure the error of the approximation on specific inputs:

The network will fail when evaluated at a pole:

Wolfram Research (2016), ElementwiseLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/ElementwiseLayer.html (updated 2022).



Wolfram Language. 2016. "ElementwiseLayer." Wolfram Language & System Documentation Center. Wolfram Research. Last Modified 2022. https://reference.wolfram.com/language/ref/ElementwiseLayer.html.


Wolfram Language. (2016). ElementwiseLayer. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/ElementwiseLayer.html


@misc{reference.wolfram_2024_elementwiselayer, author="Wolfram Research", title="{ElementwiseLayer}", year="2022", howpublished="\url{https://reference.wolfram.com/language/ref/ElementwiseLayer.html}", note={Accessed: 24-May-2024}}


@online{reference.wolfram_2024_elementwiselayer, organization={Wolfram Research}, title={ElementwiseLayer}, year={2022}, url={https://reference.wolfram.com/language/ref/ElementwiseLayer.html}, note={Accessed: 24-May-2024}}