SoftmaxLayer[] represents a softmax net layer.

Details and Options

  • SoftmaxLayer[][input] explicitly computes the output for input.
  • SoftmaxLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
  • SoftmaxLayer is typically used inside NetChain, NetGraph, etc. to normalize the output of other layers in order to use them as class probabilities for classification tasks.
  • SoftmaxLayer exposes the following ports for use in NetGraph etc.:
  • "Input"    a numerical tensor of dimensions d1×d2×…×dn
    "Output"    a numerical tensor of dimensions d1×d2×…×dn
  • When it cannot be inferred from other layers in a larger net, the option "Input"->n can be used to fix the input dimensions of SoftmaxLayer.
  • SoftmaxLayer effectively normalizes the exponential of the input tensor, producing vectors that sum to 1. The innermost dimension is used as the normalization dimension. Explicitly, when SoftmaxLayer is given a rank-k input tensor x with entries x_(n1,…,nk), it produces the tensor whose entries are exp(x_(n1,…,nk)) / Σ_m exp(x_(n1,…,n(k-1),m)).
  • Equivalently, SoftmaxLayer computes Normalize[Exp[v],Total] when applied to a vector v, and is mapped over the innermost level when applied to a tensor of higher rank.
  • SoftmaxLayer["Input"->shape] allows the shape of the input to be specified. Possible forms for shape are:
  • n    a vector of size n
    {d1,d2,…}    a tensor of dimensions d1×d2×…
    {"Varying",d1,d2,…}    a varying number of tensors of dimensions d1×d2×…
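The normalization described above can be sketched directly in top-level code, mapping Exp and Total over the innermost dimension (the input values here are hypothetical):

```wolfram
(* hypothetical 2×3 input; each innermost vector is normalized independently *)
x = {{1., 2., 3.}, {0., 0., 0.}};
softmax = Map[Exp[#]/Total[Exp[#]] &, x];
(* each row of the result sums to 1 *)
Total /@ softmax
```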



Basic Examples  (2)

Create a SoftmaxLayer:

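A minimal sketch of this example, assuming the argumentless constructor form:

```wolfram
SoftmaxLayer[]
```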

Create a SoftmaxLayer that takes a vector of length 5 as input:

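A sketch of this example, fixing the input shape with the "Input" option described above:

```wolfram
SoftmaxLayer["Input" -> 5]
```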

Apply the layer to an input vector:

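A sketch with a hypothetical length-3 input vector:

```wolfram
layer = SoftmaxLayer["Input" -> 3];
layer[{1., 2., 3.}]
(* {0.0900306, 0.244728, 0.665241} *)
```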

The elements of the result sum to 1:

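Totaling the output from the previous step (repeated here so the sketch is self-contained):

```wolfram
layer = SoftmaxLayer["Input" -> 3];
Total[layer[{1., 2., 3.}]]
(* 1. *)
```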

Scope  (4)

Properties & Relations  (3)
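One relation worth illustrating, as a hedged sketch: applied to a vector, SoftmaxLayer agrees (up to floating-point error) with the explicit normalization Normalize[Exp[v],Total] given in the Details section:

```wolfram
v = {1., 2., 3.};
Max[Abs[SoftmaxLayer[][v] - Normalize[Exp[v], Total]]] < 1.*^-6
```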

See Also

CrossEntropyLossLayer  LinearLayer  SummationLayer  TotalLayer  NetChain  NetGraph  NetTrain  NetDecoder

Introduced in 2016