PReLULayer


Parametrized Rectified Linear Unit (PReLU) layer

Since R2024a

Description

A PReLU layer performs a threshold operation, where for each channel, any input value less than zero is multiplied by a scalar learned at training time.

This operation is equivalent to:

f(x) = x     if x ≥ 0
f(x) = αx    if x < 0

where the coefficient α corresponds to the learnable parameter Alpha.
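As a minimal sketch of this operation on a plain numeric array (the scalar alpha below is an illustrative assumption; in the layer, Alpha is a learnable parameter):

% PReLU applied element-wise with a scalar multiplier (illustrative only)
x = [-2 -1 0 1 2];
alpha = 0.25;
y = max(x,0) + alpha*min(x,0)    % returns [-0.5 -0.25 0 1 2]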

Creation

Syntax

Description

`layer` = preluLayer returns a parametrized ReLU layer.

`layer` = preluLayer(`Name=Value`) returns a parametrized ReLU layer and sets the optional Name and Alpha properties. For example, preluLayer(Alpha=2,Name="prelu1") creates a PReLU layer with Alpha set to 2 and the name "prelu1".


Properties


PReLU

Alpha

Learnable multiplier for negative input values, specified as a numeric scalar, vector, or matrix. The size of Alpha must be compatible with the input size of the PReLU layer. If the sizes of Alpha and the input of the PReLU layer are compatible, then the two arrays implicitly expand to match each other. For example, if Alpha is a scalar, then the scalar is combined with each element of the other array. Similarly, vectors with different orientations (one row vector and one column vector) implicitly expand to form a matrix.

The network learns the parameter Alpha during training.

Example: 0.4
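As a hedged sketch of this implicit expansion rule, assuming feature input in "CB" format with 16 channels (the size, initial value, and layer name below are illustrative assumptions), a column vector Alpha provides one multiplier per channel and expands across the batch dimension:

% One learnable multiplier per channel (assumed 16-channel feature input)
alpha0 = 0.25*ones(16,1);
layer = preluLayer(Alpha=alpha0,Name="prelu_channelwise");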

Layer

Name

Layer name, specified as a character vector or a string scalar. If Name is "", then the software automatically assigns a name at training time.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, stored as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, stored as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, stored as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, stored as {'out'}. This layer has a single output only.

Data Types: cell

Examples


Create a PReLU layer with the name "prelu1".

layer = preluLayer(Name="prelu1")

layer = 
  PReLULayer with properties:

    Name: 'prelu1'

   Learnable Parameters
    Alpha: 0.2500

Include a PReLU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    preluLayer
    maxPooling2dLayer(2,Stride=2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    preluLayer
    fullyConnectedLayer(10)
    softmaxLayer]

layers = 
  10×1 Layer array with layers:

 1   ''   Image Input           28×28×1 images with 'zerocenter' normalization
 2   ''   2-D Convolution       16 3×3 convolutions with stride [1  1] and padding [0  0  0  0]
 3   ''   Batch Normalization   Batch normalization
 4   ''   PReLU                 PReLU
 5   ''   2-D Max Pooling       2×2 max pooling with stride [2  2] and padding [0  0  0  0]
 6   ''   2-D Convolution       32 3×3 convolutions with stride [1  1] and padding [0  0  0  0]
 7   ''   Batch Normalization   Batch normalization
 8   ''   PReLU                 PReLU
 9   ''   Fully Connected       10 fully connected layer
10   ''   Softmax               softmax

Algorithms


Layers in a layer array or layer graph pass data to subsequent layers as formatted dlarray objects. The format of a dlarray object is a string of characters in which each character describes the corresponding dimension of the data. The format consists of one or more of these characters:

"S" (spatial)
"C" (channel)
"B" (batch)
"T" (time)
"U" (unspecified)

For example, you can describe 2-D image data that is represented as a 4-D array, where the first two dimensions correspond to the spatial dimensions of the images, the third dimension corresponds to the channels of the images, and the fourth dimension corresponds to the batch dimension, as having the format "SSCB" (spatial, spatial, channel, batch).

PReLULayer objects apply an element-wise operation and support input data of any format. The layer does not add or remove any dimensions, so it outputs data with the same format as its input data.
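As a minimal sketch of this behavior (the dlnetwork wrapper and data sizes here are illustrative assumptions, not part of this layer's interface):

% Pass formatted "SSCB" data through a PReLU layer and inspect the output format.
X = dlarray(rand(28,28,3,2),"SSCB");          % two 28-by-28 RGB images
net = dlnetwork(preluLayer,Initialize=false);
net = initialize(net,X);                      % initialize learnable parameters
Y = predict(net,X);
dims(Y)                                       % 'SSCB', same format as the input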



Version History

Introduced in R2024a