ELULayer - Exponential linear unit (ELU) layer - MATLAB
Exponential linear unit (ELU) layer
Description
An ELU activation layer performs the identity operation on positive inputs and an exponential nonlinearity on negative inputs [1].
The layer performs the following operation:
f(x) = x                if x ≥ 0
f(x) = α(exp(x) − 1)    if x < 0
The default value of α is 1. Specify a value of α for the layer by setting the Alpha property.
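For intuition, the following is a minimal numeric sketch of this operation in plain MATLAB (the input vector and alpha value are illustrative; this is not the layer's internal implementation).
alpha = 1;                                     % nonlinearity parameter
x = [-3 -1 0 1 3];                             % example inputs
y = (x >= 0).*x + (x < 0).*alpha.*(exp(x)-1)   % elementwise ELU
% y ≈ [-0.9502 -0.6321 0 1 3]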
Creation
Syntax
Description
`layer` = eluLayer
creates an ELU layer.
`layer` = eluLayer(`alpha`)
creates an ELU layer and specifies the Alpha property.
`layer` = eluLayer(___,'Name',`Name`)
additionally sets the optional Name property using any of the previous syntaxes. For example, eluLayer('Name','elu1') creates an ELU layer with the name 'elu1'.
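For example, each syntax can be used as follows; the Alpha value 0.5 and the name 'elu1' are illustrative choices.
layer1 = eluLayer;                       % default Alpha = 1
layer2 = eluLayer(0.5);                  % Alpha = 0.5
layer3 = eluLayer(0.5,'Name','elu1');    % Alpha = 0.5 and the name 'elu1'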
Properties
ELU
Nonlinearity parameter α, specified as a finite real scalar. The minimum value of the output of the ELU layer equals -α, and the slope at negative inputs approaching 0 is α.
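For example, for a strongly negative input the output approaches the floor -α (the values here are illustrative):
alpha = 2;
y = alpha*(exp(-10) - 1)   % ≈ -2, close to the floor -Alpha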
Layer
Layer name, specified as a character vector or a string scalar.
Data Types: char | string
This property is read-only.
Number of inputs to the layer, stored as 1. This layer accepts a single input only.
Data Types: double
This property is read-only.
Input names, stored as {'in'}. This layer accepts a single input only.
Data Types: cell
This property is read-only.
Number of outputs from the layer, stored as 1. This layer has a single output only.
Data Types: double
This property is read-only.
Output names, stored as {'out'}. This layer has a single output only.
Data Types: cell
Examples
Create an exponential linear unit (ELU) layer with the name 'elu1' and a default value of 1 for the nonlinearity parameter Alpha.
layer = eluLayer(Name="elu1")
layer = 
  ELULayer with properties:

     Name: 'elu1'
    Alpha: 1

   Learnable Parameters
   No properties.

   State Parameters
   No properties.

  Show all properties
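As a further sketch, assuming Deep Learning Toolbox, you can apply the layer to data by wrapping it in a dlnetwork together with a feature input layer (the feature size and input values here are illustrative).
net = dlnetwork([featureInputLayer(5) eluLayer(Name="elu1")]);
X = dlarray([-2; -1; 0; 1; 2],"CB");   % five features, one observation
Y = predict(net,X)                     % identity for x >= 0, exp(x)-1 for x < 0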
Include an ELU layer in a Layer array.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    eluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    eluLayer
    fullyConnectedLayer(10)
    softmaxLayer]
layers = 
  10×1 Layer array with layers:

     1   ''   Image Input           28×28×1 images with 'zerocenter' normalization
     2   ''   2-D Convolution       16 3×3 convolutions with stride [1 1] and padding [0 0 0 0]
     3   ''   Batch Normalization   Batch normalization
     4   ''   ELU                   ELU with Alpha 1
     5   ''   2-D Max Pooling       2×2 max pooling with stride [2 2] and padding [0 0 0 0]
     6   ''   2-D Convolution       32 3×3 convolutions with stride [1 1] and padding [0 0 0 0]
     7   ''   Batch Normalization   Batch normalization
     8   ''   ELU                   ELU with Alpha 1
     9   ''   Fully Connected       10 fully connected layer
    10   ''   Softmax               softmax
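One possible continuation (a sketch; training data and options are not shown) is to assemble the array into a dlnetwork and inspect it:
net = dlnetwork(layers);
analyzeNetwork(net)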
Algorithms
Layers in a layer array or layer graph pass data to subsequent layers as formatted dlarray objects. The format of a dlarray object is a string of characters in which each character describes the corresponding dimension of the data. The format consists of one or more of these characters:
"S" — Spatial
"C" — Channel
"B" — Batch
"T" — Time
"U" — Unspecified
For example, you can describe 2-D image data that is represented as a 4-D array, where the first two dimensions correspond to the spatial dimensions of the images, the third dimension corresponds to the channels of the images, and the fourth dimension corresponds to the batch dimension, as having the format "SSCB" (spatial, spatial, channel, batch).
ELULayer objects apply an element-wise operation and support input data of any format. The layer does not add or remove any dimensions, so it outputs data with the same format as its input data.
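To verify this format pass-through (a sketch; the image and batch sizes are arbitrary), you can check the dimension labels of the output:
net = dlnetwork([imageInputLayer([28 28 1],Normalization="none") eluLayer]);
X = dlarray(randn(28,28,1,4),"SSCB");   % spatial, spatial, channel, batch
Y = predict(net,X);
dims(Y)                                 % returns 'SSCB', the same format as the input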
References
[1] Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and accurate deep network learning by exponential linear units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).
Version History
Introduced in R2019a