ParametricRampLayer

ParametricRampLayer[]

represents a net layer that computes a leaky ReLU activation with a slope that can be learned.

ParametricRampLayer[levels]

specifies the levels of the input at which each dimension has its own learnable slope.

Details and Options
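
The layer applies the parametric (leaky) ReLU function elementwise: f(x) = x for x ≥ 0 and f(x) = s x for x < 0, where the slope s is a learnable parameter. With ParametricRampLayer[levels], a separate slope is learned for each component of the dimensions at the given levels. The slopes can be read with NetExtract[net, "Slope"] and set with the "Slope" option.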

Examples


Basic Examples  (1)
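
A minimal sketch: create and initialize the layer, then apply it to a vector (the outputs for negative entries depend on the initialized slopes):

layer = NetInitialize@ParametricRampLayer["Input" -> 3]
layer[{1., -2., 3.}]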

Scope  (3)

Initialize a ParametricRampLayer that takes a vector of length 3, having a slope coefficient for each dimension:
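
For example (assuming the default levels specification, which gives one slope per component of a length-3 input):

net = NetInitialize@ParametricRampLayer["Input" -> 3]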

Get the values of the slopes:
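
NetExtract[net, "Slope"]  (* values depend on the initialization *)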

Apply the layer to some inputs:
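
Positive entries pass through unchanged; negative entries are scaled by the corresponding slope:

net[{1., -2., 3.}]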

Learn the slopes on some data:
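
A sketch using hypothetical synthetic data whose negative branch has slope 0.5; after training, the learned slope should approach that value:

(* synthetic pairs: the target is a leaky ReLU with slope 0.5 *)
data = Table[{x} -> {If[x > 0, x, 0.5 x]}, {x, RandomReal[{-1, 1}, 100]}];
trained = NetTrain[ParametricRampLayer["Input" -> 1], data];
NetExtract[trained, "Slope"]  (* should be close to 0.5 *)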

Initialize a ParametricRampLayer that takes a length-3 vector and uses a single slope coefficient shared across all dimensions:
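
One possible form; the empty levels specification {} yielding a single shared slope is an assumption:

net = NetInitialize@ParametricRampLayer[{}, "Input" -> 3]
NetExtract[net, "Slope"]  (* a single scalar slope *)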

Initialize a ParametricRampLayer that takes a sequence of length-3 vectors:
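
A sketch assuming the slopes are placed on level 2 (the vector dimension), level 1 being the varying sequence dimension:

net = NetInitialize@ParametricRampLayer[2, "Input" -> {"Varying", 3}]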

Options  (3)

LearningRateMultipliers  (1)

Create a leaky ReLU with a fixed slope of 0.1:
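
One way to do this: give the slope an initial value and freeze it with a zero learning-rate multiplier (the {} levels specification for a single shared slope is an assumption):

net = ParametricRampLayer[{}, "Slope" -> 0.1, LearningRateMultipliers -> {"Slope" -> 0}]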

Train a network with this nonlinearity:
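
A sketch with a hypothetical regression task; the chain and the data are illustrative only:

data = Table[{x} -> {Sin[x]}, {x, RandomReal[{-3, 3}, 256]}];
chain = NetChain[{LinearLayer[10], net, LinearLayer[1]}, "Input" -> 1];
trained = NetTrain[chain, data]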

The leaky ReLU still has the same slope value after training:
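
Extracting the slope from layer 2 of the hypothetical chain above:

NetExtract[trained, {2, "Slope"}]  (* still 0.1 *)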

"Slope"  (2)

Create a ParametricRampLayer already initialized with a given slope value:
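
For example, with one slope per dimension of a length-3 input:

net = ParametricRampLayer["Slope" -> {0.1, 0.5, 1.}, "Input" -> 3]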

Apply the layer to some data:
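
Each negative entry is scaled by its own slope:

net[{-1., -1., -1.}]  (* -> {-0.1, -0.5, -1.} *)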

Create a leaky ReLU with a single shared slope, fixing the value of that slope:
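
A sketch; as above, the {} levels specification for a single shared slope is an assumption:

net = ParametricRampLayer[{}, "Slope" -> 0.2, "Input" -> 3]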

Apply the layer to some data:
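
The shared slope 0.2 scales every negative entry:

net[{-1., 2., -3.}]  (* -> {-0.2, 2., -0.6} *)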

Introduced in 2020 (12.1)