ThreadingLayer

ThreadingLayer[f]

represents a net layer that takes several input arrays and applies a function f to corresponding array elements.

ThreadingLayer[f,bspec]

allows array shapes to be conformed according to broadcasting specification bspec.

Details and Options

  • ThreadingLayer is typically used inside NetGraph.
  • In ThreadingLayer[f], the function f can be any one of the following: Plus, Subtract, Times, Divide, Power, Surd, Min, Max, Clip, Mod, ArcTan, Erf.
  • In general, f can be any object that, when applied to its arguments, gives any combination of: Plus, Subtract, etc. together with numbers; any of the functions supported by ElementwiseLayer; and logical operations built from If, And, Or, Which, Piecewise, Equal, Greater, GreaterEqual, Less, LessEqual and Unequal.
  • The function f can also contain expressions of the form "name"[input], where name is one of the named functions ("ELU", "SELU", "SoftSign", etc.) accepted by ElementwiseLayer.
  • In ThreadingLayer[f,bspec], smaller arrays are implicitly replicated before applying f to match dimensions of larger arrays. The following settings can be given for bspec:
  • None    no replication
    Automatic    replicate along axes of size 1 (default)
    n    insert dimension(s) with the right size at level n if needed
    -n    count from the end
    {spec1,spec2,…}    use specific broadcasting settings for the respective input arrays
  • The option InputPorts can be used to specify the number, names, or shapes of the input ports. If InputPorts is not specified, the number of ports and their shapes are inferred from NetChain or NetGraph connectivity.
  • Besides input ports, ThreadingLayer exposes the following port for use in NetGraph etc.:
  • "Output"    an array
  • Within a NetGraph, a ThreadingLayer can be connected using a single edge of the form {src1,src2,…}->threadlayer, where threadlayer is the name or index of the ThreadingLayer, or as multiple separate edges given in the corresponding order, as src1->threadlayer,src2->threadlayer,…,srcn->threadlayer.
  • If no names are explicitly given for input ports, ThreadingLayer[f,…] exposes input ports named "Input1", "Input2", etc.
  • ThreadingLayer[Function[#Name1 + #Name2],…] exposes input ports named "Name1", "Name2", etc.
  • ThreadingLayer[f,None] forces all input arrays to have the same shape, which is also the shape of the output. ThreadingLayer[f,Automatic] forces all input arrays to have the same rank; corresponding dimensions must either be equal or be 1. ThreadingLayer[Plus,-1] automatically broadcasts its inputs like Plus, by inserting dimensions at the deepest level.
  • ThreadingLayer[…][{in1,in2,…}] explicitly computes the output from applying the layer, which is effectively given by f[in1,in2,…].
  • When given NumericArray as inputs, the output will be a NumericArray.
  • Options[ThreadingLayer] gives the list of default options to construct the layer. Options[ThreadingLayer[…]] gives the list of default options to evaluate the layer on some data.
  • Information[ThreadingLayer[…]] gives a report about the layer.
  • Information[ThreadingLayer[…],prop] gives the value of the property prop of ThreadingLayer[…]. Possible properties are the same as for NetGraph.

Examples


Basic Examples  (6)

Create a ThreadingLayer using Times as the function to be threaded:

Apply the layer to a pair of inputs:
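A minimal sketch of these two steps (the expected output follows from the elementwise product):

```wolfram
(* create the layer *)
layer = ThreadingLayer[Times]

(* apply it to two vectors of the same length *)
layer[{{1, 2, 3}, {4, 5, 6}}]
(* → {4, 10, 18} *)
```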

Create a ThreadingLayer that sums arrays with different dimensions, broadcasting the smaller arrays along the first dimensions:

Apply the layer to a pair of inputs:
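A sketch of one such layer, assuming the broadcasting specification 1 inserts the missing dimension at level 1 (so the vector is replicated across the rows of the matrix):

```wolfram
(* insert missing dimensions at level 1 *)
layer = ThreadingLayer[Plus, 1]

(* a 2×3 matrix plus a length-3 vector: the vector is added to each row *)
layer[{{{1, 1, 1}, {2, 2, 2}}, {10, 20, 30}}]
(* → {{11, 21, 31}, {12, 22, 32}} *)
```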

Create a ThreadingLayer that sums arrays with different dimensions, broadcasting the smaller arrays along the last dimensions:

Apply the layer to a pair of inputs:
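The analogous sketch with specification -1, which (as described in the Details) makes the layer broadcast like Plus by inserting dimensions at the deepest level:

```wolfram
(* insert missing dimensions counting from the end *)
layer = ThreadingLayer[Plus, -1]

(* a 2×3 matrix plus a length-2 vector: element i is added to row i *)
layer[{{{1, 1, 1}, {2, 2, 2}}, {10, 20}}]
(* → {{11, 11, 11}, {22, 22, 22}} *)
```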

Use ThreadingLayer in a NetGraph:

Apply the net to two vectors:
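One way this can be wired up, using the default port names described in the Details:

```wolfram
(* both graph inputs feed the single ThreadingLayer *)
net = NetGraph[{ThreadingLayer[Plus]},
  {{NetPort["Input1"], NetPort["Input2"]} -> 1}]

net[<|"Input1" -> {1., 2.}, "Input2" -> {3., 4.}|>]
(* → {4., 6.} *)
```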

Create a NetGraph that computes the outer sum of two arrays with ThreadingLayer:

Apply the outer sum to a pair of inputs:
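One possible construction, using per-input broadcasting specifications: spec 2 inserts a trailing dimension for the first vector and spec 1 a leading one for the second, so that out[[i, j]] = a[[i]] + b[[j]]. This is a hedged sketch, not necessarily the construction used in the original example:

```wolfram
outer = NetGraph[{ThreadingLayer[Plus, {2, 1}]},
  {{NetPort["Input1"], NetPort["Input2"]} -> 1}]

(* outer sum of a length-2 and a length-3 vector gives a 2×3 array *)
outer[<|"Input1" -> {0, 10}, "Input2" -> {1, 2, 3}|>]
(* expected: {{1, 2, 3}, {11, 12, 13}} *)
```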

The names of input ports can be specified using option "Inputs":

The names of input ports can be specified directly within the function of ThreadingLayer:

Apply the layer to an association of arrays:
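A sketch of the slot-name mechanism described in the Details: the slot names of the pure function become the input port names. The function #A + 2 #B here is an illustrative choice:

```wolfram
(* slot names "A" and "B" become the port names *)
layer = ThreadingLayer[Function[#A + 2 #B]]

layer[<|"A" -> {1., 2.}, "B" -> {10., 20.}|>]
(* → {21., 42.} *)
```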

Scope  (2)

Create a ThreadingLayer that takes a specific number of inputs:

Apply it to data:
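A sketch using the InputPorts option mentioned in the Details to fix the number of inputs; the choice of three ports and the Times function are illustrative:

```wolfram
(* three input ports, shapes left to be inferred *)
layer = ThreadingLayer[Times, InputPorts -> 3]

layer[{{1, 2}, {3, 4}, {5, 6}}]
(* → {15, 48} *)
```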

Create a ThreadingLayer that uses a custom transformation to compute a spherical Gaussian:

Evaluate the layer on two input vectors to get a vector of outputs:

Plot the output of the layer:
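A sketch of such a layer, assuming the two inputs are treated as coordinates of an unnormalized spherical Gaussian:

```wolfram
(* custom function combining corresponding elements of the two inputs *)
gauss = ThreadingLayer[Exp[-(#1^2 + #2^2)] &]

(* one output per pair of corresponding elements *)
gauss[{{0., 1.}, {0., 0.}}]
(* → {1., 0.368} (the second value is Exp[-1], to three digits) *)
```

The output could then be plotted over a grid of coordinate pairs, e.g. with ListPlot3D.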

Applications  (1)

Define a hinge loss using a ThreadingLayer:

Create a net that computes the hinge loss with a margin of 2:

When the target is within distance 2 of the input, the loss is zero:

The loss increases linearly beyond a distance of 2:
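A sketch of such a loss, assuming the hinge is taken on the absolute input–target distance with margin 2 (Ramp is among the functions ElementwiseLayer supports):

```wolfram
(* zero loss within the margin, linear growth beyond it *)
hinge = ThreadingLayer[Ramp[Abs[#1 - #2] - 2] &]

hinge[{{1.5}, {2.}}]   (* distance 0.5 < 2 *)
(* → {0.} *)

hinge[{{5.}, {2.}}]    (* distance 3 > 2 *)
(* → {1.} *)
```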

Plot the loss as a function of input for a fixed target of 2:

Perform linear regression using the hinge loss:

Plot the solution:

Compare the solutions obtained using the hinge loss, the mean absolute error, and the mean squared error:

Possible Issues  (3)

ThreadingLayer cannot accept symbolic inputs:

Certain choices of f can produce failures for inputs outside their domain:

Some pure functions are not supported:

Wolfram Research (2017), ThreadingLayer, Wolfram Language function, https://reference.wolfram.com/language/ref/ThreadingLayer.html (updated 2021).