# SoftmaxLayer

SoftmaxLayer[]

represents a softmax net layer.

SoftmaxLayer[n]

represents a softmax net layer that uses level n as the normalization dimension.

# Details and Options

• SoftmaxLayer[][input] explicitly computes the output for input.
• SoftmaxLayer[][{input1,input2,…}] explicitly computes outputs for each of the inputi.
• SoftmaxLayer is typically used inside NetChain, NetGraph, etc. to normalize the output of other layers in order to use them as class probabilities for classification tasks.
• SoftmaxLayer can operate on tensors that contain "Varying" dimensions.
• SoftmaxLayer exposes the following ports for use in NetGraph etc.:
•  "Input" — a numerical tensor of dimensions d1×d2×…×dn
•  "Output" — a numerical tensor of the same dimensions d1×d2×…×dn
• When it cannot be inferred from other layers in a larger net, the option "Input"->n can be used to fix the input dimensions of SoftmaxLayer.
• SoftmaxLayer[] is equivalent to SoftmaxLayer[-1].
• SoftmaxLayer effectively normalizes the exponential of the input tensor, producing vectors that sum to 1. For the default level of -1, the innermost dimension is used as the normalization dimension.
• When SoftmaxLayer[-1] is applied to a vector v, it produces the vector Normalize[Exp[v],Total]. When applied to a higher-rank tensor, it is mapped onto level -1.
• When SoftmaxLayer[m] is applied to a rank-k input tensor x, it produces the rank-k tensor y whose entries are y[i1,…,ik] = Exp[x[i1,…,ik]] / Σ[im] Exp[x[i1,…,ik]], where im, the index at level m, is the summed-over index of x.
• SoftmaxLayer[…,"Input"->shape] allows the shape of the input to be specified. Possible forms for shape are:
•  n — a vector of size n
•  {d1,d2,…} — a tensor of dimensions d1×d2×…
•  {"Varying",d1,d2,…} — a varying number of tensors of dimensions d1×d2×…
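As an illustration of the normalization described above, here is a minimal Python sketch (an independent re-implementation for exposition, not part of the Wolfram Language API; the function name `softmax_level` is hypothetical) of softmax over a chosen level of a rank-2 tensor, where level -1 denotes the innermost dimension:

```python
import math

def softmax_level(t, level=-1):
    """Softmax over one level of a rank-2 nested list (illustrative sketch)."""
    rows, cols = len(t), len(t[0])
    out = [[0.0] * cols for _ in range(rows)]
    if level in (2, -1):      # innermost level: normalize each row to sum to 1
        for i in range(rows):
            exps = [math.exp(x) for x in t[i]]
            s = sum(exps)
            out[i] = [e / s for e in exps]
    elif level in (1, -2):    # outer level: normalize each column to sum to 1
        for j in range(cols):
            exps = [math.exp(t[i][j]) for i in range(rows)]
            s = sum(exps)
            for i in range(rows):
                out[i][j] = exps[i] / s
    else:
        raise ValueError("level must be 1, 2, -1, or -2 for a rank-2 tensor")
    return out

x = [[1.0, 2.0], [3.0, 4.0]]
by_row = softmax_level(x, -1)  # default level: each row sums to 1
by_col = softmax_level(x, 1)   # level 1: each column sums to 1
```

This mirrors the behavior that a level-m softmax exponentiates every entry and divides by the sum taken along the dimension at level m only, leaving the other dimensions untouched.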

# Examples


## Basic Examples (2)

Create a SoftmaxLayer:

 In[1]:= SoftmaxLayer[]
 Out[1]=

Create a SoftmaxLayer that takes a vector of length 5 as input:

 In[1]:= softmax = SoftmaxLayer["Input" -> 5]
 Out[1]=

Apply the layer to an input vector:

 In[2]:= softmax[{1., 2., 3., 4., 5.}]
 Out[2]= {0.0116562, 0.0316849, 0.0861285, 0.234122, 0.636409}

The elements of the result sum to 1:

 In[3]:= Total[%]
 Out[3]= 1.
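For readers outside the Wolfram Language, the arithmetic of this example can be reproduced with a short Python sketch (an independent re-implementation, not Wolfram code):

```python
import math

def softmax(v):
    # Exponentiate, then divide by the total, like Normalize[Exp[v], Total]
    exps = [math.exp(x) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0, 4.0, 5.0])
print([round(p, 6) for p in probs])  # larger inputs get larger probabilities
print(sum(probs))                    # the entries sum to 1 (up to rounding)
```

Note that softmax preserves the ordering of its inputs, so the largest input always receives the largest probability.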