mishLayer - Mish layer - MATLAB
Description
Use mishLayer objects to apply the mish function to the layer inputs. This equation describes the mish operation:

f(x) = x tanh(ln(1 + e^x))
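For intuition, this minimal sketch evaluates the mish function elementwise on a numeric array using plain MATLAB arithmetic (not the layer object itself):

x = linspace(-5,5,11);
y = x .* tanh(log(1 + exp(x)));   % f(x) = x*tanh(ln(1+e^x))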
Creation
Syntax
Description
`layer` = dlhdl.layer.mishLayer(`Name`) creates a mish layer with the name specified by Name. For example, dlhdl.layer.mishLayer("mish1") creates a mish layer with the name "mish1".
Properties

Name — Layer name, specified as a character vector or string scalar.

Data Types: char | string
This property is read-only.

NumInputs — Number of inputs to the layer, stored as 1. This layer accepts a single input only.

Data Types: double
This property is read-only.

InputNames — Input names, stored as {'in'}. This layer accepts a single input only.

Data Types: cell
This property is read-only.

NumOutputs — Number of outputs from the layer, stored as 1. This layer has a single output only.

Data Types: double
This property is read-only.

OutputNames — Output names, stored as {'out'}. This layer has a single output only.

Data Types: cell
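As a quick check, a minimal sketch that queries these properties after creating a layer (assuming the standard layer property names shown above):

layer = dlhdl.layer.mishLayer("mish1");
layer.NumInputs    % 1
layer.InputNames   % {'in'}
layer.NumOutputs   % 1
layer.OutputNames  % {'out'}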
Examples
Create a mish layer with the name "mish1".
layer = dlhdl.layer.mishLayer("mish1")

layer =
  mishLayer with properties:

    Name: 'mish1'

   Learnable Parameters
    No properties.

   State Parameters
    No properties.
Include the mish layer in a Layer array. Give each layer a unique name so that the array can later be assembled into a network.

layers = [
    imageInputLayer([20,20,3],'Normalization',"none",'Name','input')
    convolution2dLayer([5 5],3,'Padding',[1 2 1 2],'Stride',[1 1],'Name','conv_1')
    batchNormalizationLayer('Name','batchnorm_1')
    dlhdl.layer.mishLayer("mish1")
    convolution2dLayer([5 5],3,'Padding',[1 2 1 2],'Stride',[2 2],'Name','conv_2')
    batchNormalizationLayer('Name','batchnorm_2')
    swishLayer('Name','swish')]
layers =
  7×1 Layer array with layers:

     1   'input'         Image Input             20×20×3 images
     2   'conv_1'        2-D Convolution         3 5×5 convolutions with stride [1 1] and padding [1 2 1 2]
     3   'batchnorm_1'   Batch Normalization     Batch normalization
     4   'mish1'         dlhdl.layer.mishLayer   Custom mish Layer
     5   'conv_2'        2-D Convolution         3 5×5 convolutions with stride [2 2] and padding [1 2 1 2]
     6   'batchnorm_2'   Batch Normalization     Batch normalization
     7   'swish'         Swish                   Swish
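Because the layer names are unique, the array can be assembled into a network object; a minimal sketch (assuming the custom layer supports dlnetwork validation):

net = dlnetwork(layers);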
Algorithms

A mish activation layer applies the mish function to the layer inputs. The mish operation uses this equation:

f(x) = x tanh(ln(1 + e^x))

The mish layer does not change the size of the input. Activation layers such as mish layers improve the training accuracy for some applications and usually follow convolution and normalization layers. Other nonlinear activation layers perform different operations. For a list of activation layers, see Activation Layers.
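A minimal numeric sketch of this size-preserving behavior, again using plain elementwise arithmetic rather than the layer itself:

X = randn(20,20,3);
Y = X .* tanh(log(1 + exp(X)));
isequal(size(X),size(Y))   % true: mish preserves the input size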
Version History
Introduced in R2024a