SwishLayer - Swish layer - MATLAB


Description

A swish activation layer applies the swish function on the layer inputs.

The swish operation is given by f(x) = x / (1 + e^(−x)).
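As a quick illustration, a minimal sketch of the swish operation applied element-wise in MATLAB (the input values are illustrative):

x = [-2 -1 0 1 2];
f = x ./ (1 + exp(-x))   % swish: x times the logistic sigmoid of x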

Creation

Syntax

Description

`layer` = swishLayer creates a swish layer.

`layer` = swishLayer('Name',`Name`) creates a swish layer and sets the optional Name property using a name-value argument. For example, swishLayer('Name','swish1') creates a swish layer with the name 'swish1'.


Properties


Name

Layer name, specified as a character vector or a string scalar.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, stored as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, stored as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, stored as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, stored as {'out'}. This layer has a single output only.

Data Types: cell

Examples


Create a swish layer with the name 'swish1'.

layer = swishLayer('Name','swish1')

layer = 
  SwishLayer with properties:

    Name: 'swish1'

   Learnable Parameters
    No properties.

   State Parameters
    No properties.

Include a swish layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    swishLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer]

layers = 7×1 Layer array with layers:

 1   ''   Image Input           28×28×1 images with 'zerocenter' normalization
 2   ''   2-D Convolution       20 5×5 convolutions with stride [1  1] and padding [0  0  0  0]
 3   ''   Batch Normalization   Batch normalization
 4   ''   Swish                 Swish
 5   ''   2-D Max Pooling       2×2 max pooling with stride [2  2] and padding [0  0  0  0]
 6   ''   Fully Connected       10 fully connected layer
 7   ''   Softmax               softmax

Algorithms


A swish activation layer applies the swish function on the layer inputs. The swish operation is given by f(x) = x / (1 + e^(−x)). The swish layer does not change the size of its input.

Activation layers such as swish layers improve the training accuracy for some applications and usually follow convolution and normalization layers. Other nonlinear activation layers perform different operations. For a list of activation layers, see Activation Layers.

Layers in a layer array or layer graph pass data to subsequent layers as formatted dlarray objects. The format of a dlarray object is a string of characters in which each character describes the corresponding dimension of the data. The format consists of one or more of these characters:

"S" (spatial)
"C" (channel)
"B" (batch)
"T" (time)
"U" (unspecified)

For example, you can describe 2-D image data that is represented as a 4-D array, where the first two dimensions correspond to the spatial dimensions of the images, the third dimension corresponds to the channels of the images, and the fourth dimension corresponds to the batch dimension, as having the format "SSCB" (spatial, spatial, channel, batch).

SwishLayer objects apply an element-wise operation and support input data of any format. The layer does not add or remove any dimensions, so it outputs data with the same format as its input data.
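As an illustration, the following minimal sketch, which assumes Deep Learning Toolbox and uses an arbitrary 4-by-4, 3-channel input size chosen for this example, passes a formatted dlarray through a swish layer and confirms that the output has the same size and format as the input:

layers = [
    imageInputLayer([4 4 3],'Normalization','none')
    swishLayer];
net = dlnetwork(layers);

X = dlarray(rand(4,4,3,2),'SSCB');   % batch of 2 random 4-by-4 images with 3 channels
Y = predict(net,X);

size(Y)   % same size as the input: 4 4 3 2
dims(Y)   % same format as the input: 'SSCB'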


Version History

Introduced in R2021a