NetGraph[{layer1,layer2,…},{m1->n1,m2->n2,…}] specifies a neural net defined by a graph in which the output of layer mi is given as input to layer ni.


NetGraph[<|name1->layer1,name2->layer2,…|>,{conn1,conn2,…}] specifies a net with explicitly named layers.


  • For a net with a single input port, NetGraph[…][data] gives the result of applying the net to data.
  • For a net with multiple input ports, NetGraph[…][<|port1->data1,port2->data2,…|>] provides data to each port.
  • For a net with a single output port, NetGraph[…][data] gives the output for that port.
  • For a net with multiple output ports, NetGraph[…][data,oport] gives the output for the output port named oport. NetGraph[…][data] or NetGraph[…][data,All] gives an association of the outputs for all ports.
  • NetGraph[…][data,NetPortGradient[iport]] gives the gradient of the output with respect to the value of the input port iport.
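For instance, a minimal sketch of applying a graph and querying a gradient (the layer choices and sizes here are illustrative, not from the original page):

```wolfram
(* a graph with one input port and one output port *)
net = NetInitialize@NetGraph[{LinearLayer[3], SoftmaxLayer[]}, {1 -> 2}, "Input" -> 2];
net[{1., 2.}]                               (* apply to data: a length-3 probability vector *)
net[{1., 2.}, NetPortGradient["Input"]]     (* gradient of the output w.r.t. the "Input" port *)
```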
  • Normal[NetGraph[…]] will return a list or association of the layers used to construct the graph.
  • If one or more input or output ports of any layers are left unconnected, these will become ports of the entire NetGraph.
  • If multiple output ports of layers are left unconnected and share the same name, they will become separate ports of the entire NetGraph with names "Port1", "Port2", etc.
  • Input or output ports for the entire NetGraph can be created by specifying NetPort["input"]->layer or layer->NetPort["output"] in the list of connections.
  • If the n-th layer has more than one input port or more than one output port, these can be disambiguated by writing NetPort[{n,"port"}] or NetPort[n,"port"].
  • If a layer has a port that accepts multiple inputs, such as CatenateLayer or SummationLayer, multiple connections can be made simultaneously by writing {m1,m2,…}->n.
  • A linear chain of connections within the graph can be specified as layer1->layer2->…->layern, which will cause each layeri to be connected to layeri+1.
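The connection forms above can be combined; a hedged sketch (port names "a" and "b" and all sizes are invented for illustration):

```wolfram
(* two graph inputs feeding the multi-input port of a CatenateLayer,
   followed by a linear chain of layers; Ramp is a shorthand layer spec *)
net = NetGraph[
  {CatenateLayer[], LinearLayer[4], Ramp},
  {{NetPort["a"], NetPort["b"]} -> 1, 1 -> 2 -> 3},
  "a" -> 2, "b" -> 3]
```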
  • When ambiguous, the tensor shapes of input and output ports of the entire graph can be specified with options of the form "port"->shape. Valid shapes include:
  •   "Real"                   a single real number
      "Integer"                a single integer
      n                        a vector of length n
      {n1,n2,…}                a tensor of dimensions n1×n2×…
      "Varying"                a variable-length vector
      {"Varying",n2,n3,…}      a variable-length sequence of tensors of dimensions n2×n3×…
      NetEncoder[…]            an encoder (for input ports)
      NetDecoder[…]            a decoder (for output ports)
      "type"                   equivalent to NetEncoder["type"] or NetDecoder["type"]
      {n,coder}                an encoder or decoder mapped over a sequence of length n
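As a sketch of port-shape options (the layers, class labels, and sizes are invented for illustration):

```wolfram
(* input: a variable-length sequence of 2-vectors; output: decoded as a class label *)
net = NetGraph[
  {LongShortTermMemoryLayer[8], SequenceLastLayer[], LinearLayer[2], SoftmaxLayer[]},
  {1 -> 2 -> 3 -> 4},
  "Input" -> {"Varying", 2},
  "Output" -> NetDecoder[{"Class", {"yes", "no"}}]]
```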
  • NetGraph supports the following special layer specifications when giving individual layers:
  •   Ramp, LogisticSigmoid, …      ElementwiseLayer[f] for a named unary function f
  • The StandardForm of NetGraph shows the connectivity of layers in the graph and annotates edges with the dimensions of the tensor that the edge represents. Clicking a layer in the graph shows more information about that layer.
  • The TraditionalForm of NetGraph shows a more publication-appropriate depiction of the graph.
  • Take[NetGraph[…],{start,end}] returns a subgraph of the given NetGraph that contains only the layers that connect start and end. The following forms can be given for start and end:
  •   n, "layer"                  a specified layer (by index or name)
      NetPort[layer,"port"]       the specified input or output port of a layer
      NetPort["port"]             an input or output port of the entire graph
      All                         all of the inputs or outputs of the graph
      {spec1,spec2,…}             the union of the speci
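For example, a hedged sketch of taking subgraphs (the particular layers and sizes are assumptions for illustration):

```wolfram
net = NetGraph[{LinearLayer[4], Ramp, LinearLayer[2], SoftmaxLayer[]},
  {1 -> 2 -> 3 -> 4}, "Input" -> 3];
Take[net, {2, 3}]                    (* the subgraph spanning layers 2 through 3 *)
Take[net, {NetPort["Input"], 2}]     (* from the graph's input up to layer 2 *)
```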
  • VertexDelete[NetGraph[…],layer] deletes one or more layers from a NetGraph, returning a new graph. Layers such as ElementwiseLayer[…] that have the same input and output size are removed by connecting their inputs directly to their outputs.
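A minimal sketch of layer deletion (layer choices are illustrative):

```wolfram
net = NetGraph[{LinearLayer[4], ElementwiseLayer[Ramp], LinearLayer[2]},
  {1 -> 2 -> 3}, "Input" -> 3];
VertexDelete[net, 2]   (* deletes the Ramp layer; layer 1 now feeds layer 3 directly *)
```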
  • NetGraph[…][data,opts] specifies that options should be used in applying the net to data. Possible options include:
  •   NetEvaluationMode    "Test"    what mode to use in performing evaluation
      TargetDevice         "CPU"     the target device on which to perform evaluation
  • With the setting NetEvaluationMode->"Training", layers such as DropoutLayer will behave as they do for training rather than ordinary evaluation.
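For instance, a sketch of the effect of the evaluation mode (net structure is an assumption for illustration):

```wolfram
net = NetInitialize@NetGraph[{LinearLayer[4], DropoutLayer[0.5]}, {1 -> 2}, "Input" -> 4];
net[{1., 2., 3., 4.}]                                   (* dropout is inactive in the default "Test" mode *)
net[{1., 2., 3., 4.}, NetEvaluationMode -> "Training"]  (* dropout is applied stochastically *)
```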
  • NetGraph[…][[spec]] extracts the layer specified by spec from the net.
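A minimal sketch of extracting layers (the net here is invented for illustration):

```wolfram
net = NetGraph[{LinearLayer[4], Ramp}, {1 -> 2}, "Input" -> 2];
net[[1]]        (* extract the first layer, the LinearLayer *)
Normal[net]     (* the list of layers making up the graph *)
```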



Basic Examples  (2)

Create a net with one layer:


Apply the net to an input:

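The copyable inputs did not survive extraction; a plausible reconstruction of this example (the specific layer is an assumption):

```wolfram
(* create a net with one layer *)
net = NetGraph[{ElementwiseLayer[Ramp]}, {}]

(* apply the net to an input; Ramp zeroes the negative entries *)
net[{-1., 2., -3.}]
```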

Create a net with two layers:


Initialize all arrays in the net:


Apply the net to an input:

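The copyable inputs did not survive extraction; a plausible reconstruction of this two-layer example (layer choices and sizes are assumptions):

```wolfram
(* create a net with two layers *)
net = NetGraph[{LinearLayer[2], SoftmaxLayer[]}, {1 -> 2}, "Input" -> 3]

(* initialize all arrays in the net *)
net = NetInitialize[net]

(* apply the net to an input; gives a length-2 probability vector *)
net[{1., 2., 3.}]
```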

Scope  (9)

Applications  (1)

Properties & Relations  (4)

See Also

NetModel  NetPort  NetChain  NetInitialize  NetTrain  NetExtract  NetEncoder  NetDecoder  LinearLayer  ElementwiseLayer  ClassifierMeasurements

Introduced in 2016