MeanSquaredLossLayer
represents a loss layer that computes the mean squared loss between its "Input" port and "Target" port.
Details and Options


- MeanSquaredLossLayer exposes the following ports for use in NetGraph etc.:
-
    "Input"     an array of arbitrary rank
    "Target"    an array of the same rank as "Input"
    "Loss"      a real number
- MeanSquaredLossLayer[…][<|"Input"->in,"Target"->target|>] explicitly computes the output from applying the layer (see the sketch after this list).
- MeanSquaredLossLayer[…][<|"Input"->{in_1,in_2,…},"Target"->{target_1,target_2,…}|>] explicitly computes outputs for each of the in_i and target_i.
- When given a NumericArray as input, the output will be a NumericArray.
- MeanSquaredLossLayer is typically used inside NetGraph to construct a training network.
- A MeanSquaredLossLayer[…] can be provided as the third argument to NetTrain when training a specific network.
- When appropriate, MeanSquaredLossLayer is automatically used by NetTrain if an explicit loss specification is not provided.
- MeanSquaredLossLayer["port"->shape] allows the shape of the given input "port" to be specified. Possible forms for shape include:
-
    "Real"                                  a single real number
    n                                       a vector of length n
    {n_1,n_2,…}                             an array of dimensions n_1×n_2×…
    "Varying"                               a vector whose length is variable
    {"Varying",n_2,n_3,…}                   an array whose first dimension is variable and whose remaining dimensions are n_2×n_3×…
    NetEncoder[…]                           an encoder
    NetEncoder[{…,"Dimensions"->{n_1,…}}]   an encoder mapped over an array of dimensions n_1×…
Examples
Basic Examples (3)
Create a MeanSquaredLossLayer:
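For example, the default form, with the input shape left to be inferred:

    MeanSquaredLossLayer[]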
Create a MeanSquaredLossLayer that takes length-3 vectors:
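One way to do this, using the "port"->shape form from the Details section:

    MeanSquaredLossLayer["Input" -> 3]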
Create a NetGraph containing a MeanSquaredLossLayer:
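A sketch of one such training graph; the LinearLayer, its sizes, and the port wiring are illustrative assumptions rather than the only possible construction:

    NetGraph[
      {LinearLayer[1, "Input" -> 2], MeanSquaredLossLayer[]},
      {1 -> NetPort[2, "Input"], NetPort["Target"] -> NetPort[2, "Target"]}
    ]

The resulting graph exposes "Input" and "Target" ports and produces a "Loss" output, so it can be given directly to NetTrain.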
Scope (4)
Applications (1)
Properties & Relations (2)
Introduced in 2016 (11.0) | Updated in 2019 (12.0)