relu - Apply rectified linear unit activation - MATLAB


Apply rectified linear unit activation

Syntax

Y = relu(X)

Description

The rectified linear unit (ReLU) activation operation performs a nonlinear threshold operation, where any input value less than zero is set to zero.

This operation is equivalent to:

$$f(x) = \begin{cases} x, & x \geq 0 \\ 0, & x < 0 \end{cases}$$

Note

This function applies the ReLU operation to dlarray data. If you want to apply the ReLU activation within a dlnetwork object, use reluLayer.

Y = relu(X) computes the ReLU activation of the input X by applying a threshold operation. All values in X that are less than zero are set to zero.
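For instance, a minimal check (the input values here are illustrative, not from the original page):

X = dlarray([-2 -0.5 0 1 3]);
Y = relu(X)   % returns 0 0 0 1 3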


Examples


Create a formatted dlarray object containing a batch of 128 28-by-28 images with 3 channels. Specify the format 'SSCB' (spatial, spatial, channel, batch).

miniBatchSize = 128;
inputSize = [28 28];
numChannels = 3;
X = rand(inputSize(1),inputSize(2),numChannels,miniBatchSize);
X = dlarray(X,"SSCB");

View the size and format of the input data.
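One way to do this, using the size and dims functions of dlarray:

size(X)   % 28 28 3 128
dims(X)   % 'SSCB'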

Apply the ReLU operation using the relu function.
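A sketch of the call, following the syntax above:

Y = relu(X);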

View the size and format of the output.
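The output has the same size and format as the input:

size(Y)   % 28 28 3 128
dims(Y)   % 'SSCB'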

Output Arguments


ReLU activations, returned as a dlarray. The output Y has the same underlying data type as the input X.

If the input data X is a formatted dlarray, Y has the same dimension format as X. If the input data is not a formatted dlarray, Y is an unformatted dlarray with the same dimension order as the input data.
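A small sketch illustrating both cases (the array sizes are illustrative):

Xf = dlarray(rand(4,4,3),"SSC");
dims(relu(Xf))   % 'SSC' — formatted input gives a formatted output

Xu = dlarray(rand(4,4,3));
dims(relu(Xu))   % ''    — unformatted input gives an unformatted output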

Extended Capabilities


The relu function supports GPU array input with these usage notes and limitations:

When the input argument X is a dlarray with underlying data of type gpuArray, this function runs on the GPU.

For more information, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox).
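A minimal sketch, assuming Parallel Computing Toolbox and a supported GPU are available:

X = dlarray(gpuArray(rand(28,28,3,128,"single")),"SSCB");
Y = relu(X);              % computation runs on the GPU
class(extractdata(Y))     % 'gpuArray' — underlying data stays on the GPU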

Version History

Introduced in R2019b