MeanSquaredLossLayer
represents a loss layer that computes the mean squared loss between its "Input" port and "Target" port.
Details and Options
- MeanSquaredLossLayer exposes the following ports for use in NetGraph etc.:
"Input"	an array of arbitrary rank
"Target"	an array of the same rank as "Input"
"Loss"	a real number
- MeanSquaredLossLayer[…][<|"Input"->in,"Target"->target|>] explicitly computes the output from applying the layer.
- MeanSquaredLossLayer[…][<|"Input"->{in1,in2,…},"Target"->{target1,target2,…}|>] explicitly computes outputs for each of the ini and targeti.
- When given a NumericArray as input, the output will be a NumericArray.
- MeanSquaredLossLayer is typically used inside NetGraph to construct a training network.
- A MeanSquaredLossLayer[…] can be provided as the third argument to NetTrain when training a specific network.
- When appropriate, MeanSquaredLossLayer is automatically used by NetTrain if an explicit loss specification is not provided.
- MeanSquaredLossLayer["port"->shape] allows the shape of the given input "port" to be specified. Possible forms for shape include:
"Real"	a single real number
n	a vector of length n
{n1,n2,…}	an array of dimensions n1×n2×…
"Varying"	a vector whose length is variable
{"Varying",n2,n3,…}	an array whose first dimension is variable and whose remaining dimensions are n2×n3×…
NetEncoder[…]	an encoder
NetEncoder[{…,"Dimensions"->{n1,…}}]	an encoder mapped over an array of dimensions n1×…
- Options[MeanSquaredLossLayer] gives the list of default options to construct the layer.
- Options[MeanSquaredLossLayer[…]] gives the list of default options to evaluate the layer on some data.
- Information[MeanSquaredLossLayer[…]] gives a report about the layer.
- Information[MeanSquaredLossLayer[…],prop] gives the value of the property prop of MeanSquaredLossLayer[…]. Possible properties are the same as for NetGraph.
Examples
Basic Examples (3)
Create a MeanSquaredLossLayer:
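For example:

MeanSquaredLossLayer[]  (* port shapes are left unspecified and are inferred later from data or a containing net *)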
Create a MeanSquaredLossLayer that takes length-3 vectors:
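One way to do this, using the "Input" shape form listed under Details:

MeanSquaredLossLayer["Input" -> 3]  (* the "Target" port is then constrained to the same shape *)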
Create a NetGraph containing a MeanSquaredLossLayer:
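A minimal sketch, assuming a LinearLayer feeding the loss (the choice and size of the first layer are illustrative):

NetGraph[
 {LinearLayer[3], MeanSquaredLossLayer[]},
 {1 -> NetPort[2, "Input"], NetPort["Target"] -> NetPort[2, "Target"]}
]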
Scope (4)
Create a MeanSquaredLossLayer:
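For example:

loss = MeanSquaredLossLayer[]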
Apply the MeanSquaredLossLayer to a pair of matrices:
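A sketch with illustrative values, using the layer created above:

loss[<|"Input" -> {{1., 2.}, {3., 4.}}, "Target" -> {{1., 1.}, {2., 2.}}|>]
(* the mean of the elementwise squared differences {0., 1., 1., 4.}, i.e. 1.5 *)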
Apply the MeanSquaredLossLayer to a pair of vectors:
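Similarly, with illustrative vectors:

loss[<|"Input" -> {1., 2., 3.}, "Target" -> {2., 2., 2.}|>]
(* the mean of {1., 0., 1.}, roughly 0.667 *)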
Apply the MeanSquaredLossLayer to a pair of numbers:
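And with two scalars:

loss[<|"Input" -> 1., "Target" -> 3.|>]
(* the squared difference (1. - 3.)^2, i.e. 4. *)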
Create a MeanSquaredLossLayer that assumes the input data are vectors of length 2:
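One possible specification:

loss2 = MeanSquaredLossLayer["Input" -> 2]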
Thread the layer across a batch of inputs:
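A sketch with an illustrative batch of two input/target pairs:

loss2[<|"Input" -> {{1., 2.}, {3., 4.}}, "Target" -> {{0., 2.}, {3., 2.}}|>]
(* one loss per pair in the batch: {0.5, 2.} *)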
Create a MeanSquaredLossLayer that takes two variable-length vectors:
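One possible specification, declaring both ports as "Varying":

vloss = MeanSquaredLossLayer["Input" -> "Varying", "Target" -> "Varying"]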
Apply the layer to an input and target vector:
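With illustrative length-4 vectors:

vloss[<|"Input" -> {1., 2., 3., 4.}, "Target" -> {2., 2., 2., 2.}|>]
(* the mean of {1., 0., 1., 4.}, i.e. 1.5 *)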
Thread the layer over a batch of input and target vectors:
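A sketch in which the batch elements have different lengths (illustrative values):

vloss[<|"Input" -> {{1., 2.}, {3., 4., 5.}}, "Target" -> {{0., 2.}, {3., 3., 3.}}|>]
(* one loss per pair, approximately {0.5, 1.67} *)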
Create a MeanSquaredLossLayer that takes two images as input:
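A sketch using image NetEncoders (the 32×32 size is illustrative), applied here to two random images:

imgLoss = MeanSquaredLossLayer[
  "Input" -> NetEncoder[{"Image", {32, 32}}],
  "Target" -> NetEncoder[{"Image", {32, 32}}]]
imgLoss[<|"Input" -> RandomImage[1, {32, 32}], "Target" -> RandomImage[1, {32, 32}]|>]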
Applications (1)
Define a single-layer neural network that takes in scalar numeric values and produces scalar numeric values, and train this network using a MeanSquaredLossLayer:
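A minimal sketch, assuming synthetic training data for the function 2x+1 (the data and training details are illustrative):

data = Table[x -> 2 x + 1, {x, RandomReal[{-1, 1}, 256]}];
net = LinearLayer["Input" -> "Real", "Output" -> "Real"];
trained = NetTrain[net, data, MeanSquaredLossLayer[]];
trained[0.5]  (* should be close to 2. *)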
Properties & Relations (2)
MeanSquaredLossLayer computes:
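The loss is the mean of the elementwise squared differences between "Input" and "Target"; in Wolfram Language terms, Mean[Flatten[(input - target)^2]].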
Compare the output of the layer and the definition on an example:
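A sketch with illustrative vectors:

input = {1., 2., 3.};
target = {2., 4., 6.};
MeanSquaredLossLayer[][<|"Input" -> input, "Target" -> target|>]
Mean[(input - target)^2]
(* both give (1. + 4. + 9.)/3, roughly 4.667 *)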
MeanSquaredLossLayer effectively computes a normalized version of SquaredEuclideanDistance:
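Continuing with the same illustrative vectors, dividing by the number of elements gives the same value as the layer:

SquaredEuclideanDistance[input, target]/Length[input]
(* matches the layer output above *)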