Define Custom Recurrent Deep Learning Layer

If Deep Learning Toolbox™ does not provide the layer you require for your task, then you can define your own custom layer using this example as a guide. For a list of built-in layers, see List of Deep Learning Layers.

To define a custom deep learning layer, you can use the template provided in this example, which takes you through these steps:

  1. Name the layer — Give the layer a name so that you can use it in MATLAB®.
  2. Declare the layer properties — Specify the properties of the layer, including learnable parameters and state parameters.
  3. Create the constructor function (optional) — Specify how to construct the layer and initialize its properties. If you do not specify a constructor function, then at creation, the software initializes the Name, Description, and Type properties with [] and sets the number of layer inputs and outputs to 1.
  4. Create initialize function (optional) — Specify how to initialize the learnable and state parameters when the software initializes the network. If you do not specify an initialize function, then the software does not initialize parameters when it initializes the network.
  5. Create forward functions — Specify how data passes forward through the layer (forward propagation) at prediction time and at training time.
  6. Create reset state function (optional) — Specify how to reset state parameters.
  7. Create a backward function (optional) — Specify the derivatives of the loss with respect to the input data and the learnable parameters (backward propagation). If you do not specify a backward function, then the forward functions must support dlarray objects.

When you define the layer functions, you can use dlarray objects. Using dlarray objects makes working with high dimensional data easier by allowing you to label the dimensions. For example, you can label which dimensions correspond to spatial, time, channel, and batch dimensions using the "S", "T", "C", and "B" labels, respectively. For unspecified and other dimensions, use the "U" label. For dlarray object functions that operate over particular dimensions, you can specify the dimension labels by formatting the dlarray object directly, or by using the DataFormat option.
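For example, this short sketch (with illustrative sizes) creates a formatted dlarray for sequence data and queries its dimensions:

% Sequence data: 12 channels, 16 observations, 20 time steps.
X = dlarray(rand(12,16,20),"CBT");

dims(X)                 % 'CBT'
finddim(X,"T")          % 3, the position of the time dimension
size(X,finddim(X,"B"))  % 16, the mini-batch size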

Using formatted dlarray objects in custom layers also allows you to define layers where the inputs and outputs have different formats, such as layers that permute, add, or remove dimensions. For example, you can define a layer that takes as input a mini-batch of images with the format "SSCB" (spatial, spatial, channel, batch) and output a mini-batch of sequences with the format "CBT" (channel, batch, time). Using formatted dlarray objects also allows you to define layers that can operate on data with different input formats, for example, layers that support inputs with the formats "SSCB" (spatial, spatial, channel, batch) and "CBT" (channel, batch, time).

dlarray objects also enable support for automatic differentiation. Consequently, if your forward functions fully support dlarray objects, then defining the backward function is optional.

To enable support for using formatted dlarray objects in custom layer forward functions, also inherit from the nnet.layer.Formattable class when defining the custom layer. For an example, see Define Custom Deep Learning Layer with Formatted Inputs.

This example shows how to define a peephole LSTM layer [1], which is a recurrent layer with learnable parameters, and use it in a neural network. A peephole LSTM layer is a variant of an LSTM layer, where the gate calculations use the layer cell state.

Custom Layer Template

Copy the custom layer template into a new file in MATLAB. This template outlines the structure of a layer class definition, including the optional properties blocks and the layer methods.

classdef myLayer < nnet.layer.Layer % ...
        % & nnet.layer.Formattable ... % (Optional)
        % & nnet.layer.Acceleratable % (Optional)

properties
    % (Optional) Layer properties.

    % Declare layer properties here.
end

properties (Learnable)
    % (Optional) Layer learnable parameters.

    % Declare learnable parameters here.
end

properties (State)
    % (Optional) Layer state parameters.

    % Declare state parameters here.
end

properties (Learnable, State)
    % (Optional) Nested dlnetwork objects with both learnable
    % parameters and state parameters.

    % Declare nested networks with learnable and state parameters here.
end

methods
    function layer = myLayer()
        % (Optional) Create a myLayer.
        % This function must have the same name as the class.

        % Define layer constructor function here.
    end

    function layer = initialize(layer,layout)
        % (Optional) Initialize layer learnable and state parameters.
        %
        % Inputs:
        %         layer  - Layer to initialize
        %         layout - Data layout, specified as a networkDataLayout
        %                  object
        %
        % Outputs:
        %         layer - Initialized layer
        %
        %  - For layers with multiple inputs, replace layout with 
        %    layout1,...,layoutN, where N is the number of inputs.
        
        % Define layer initialization function here.
    end
    

    function [Y,state] = predict(layer,X)
        % Forward input data through the layer at prediction time and
        % output the result and updated state.
        %
        % Inputs:
        %         layer - Layer to forward propagate through 
        %         X     - Input data
        % Outputs:
        %         Y     - Output of layer forward function
        %         state - (Optional) Updated layer state
        %
        %  - For layers with multiple inputs, replace X with X1,...,XN, 
        %    where N is the number of inputs.
        %  - For layers with multiple outputs, replace Y with 
        %    Y1,...,YM, where M is the number of outputs.
        %  - For layers with multiple state parameters, replace state 
        %    with state1,...,stateK, where K is the number of state 
        %    parameters.

        % Define layer predict function here.
    end

    function [Y,state,memory] = forward(layer,X)
        % (Optional) Forward input data through the layer at training
        % time and output the result, the updated state, and a memory
        % value.
        %
        % Inputs:
        %         layer - Layer to forward propagate through 
        %         X     - Layer input data
        % Outputs:
        %         Y      - Output of layer forward function 
        %         state  - (Optional) Updated layer state 
        %         memory - (Optional) Memory value for custom backward
        %                  function
        %
        %  - For layers with multiple inputs, replace X with X1,...,XN, 
        %    where N is the number of inputs.
        %  - For layers with multiple outputs, replace Y with 
        %    Y1,...,YM, where M is the number of outputs.
        %  - For layers with multiple state parameters, replace state 
        %    with state1,...,stateK, where K is the number of state 
        %    parameters.

        % Define layer forward function here.
    end

    function layer = resetState(layer)
        % (Optional) Reset layer state.

        % Define reset state function here.
    end

    function [dLdX,dLdW,dLdSin] = backward(layer,X,Y,dLdY,dLdSout,memory)
        % (Optional) Backward propagate the derivative of the loss
        % function through the layer.
        %
        % Inputs:
        %         layer   - Layer to backward propagate through 
        %         X       - Layer input data 
        %         Y       - Layer output data 
        %         dLdY    - Derivative of loss with respect to layer 
        %                   output
        %         dLdSout - (Optional) Derivative of loss with respect 
        %                   to state output
        %         memory  - Memory value from forward function
        % Outputs:
        %         dLdX   - Derivative of loss with respect to layer input
        %         dLdW   - (Optional) Derivative of loss with respect to
        %                  learnable parameter 
        %         dLdSin - (Optional) Derivative of loss with respect to 
        %                  state input
        %
        %  - For layers with state parameters, the backward syntax must
        %    include both dLdSout and dLdSin, or neither.
        %  - For layers with multiple inputs, replace X and dLdX with
        %    X1,...,XN and dLdX1,...,dLdXN, respectively, where N is
        %    the number of inputs.
        %  - For layers with multiple outputs, replace Y and dLdY with
        %    Y1,...,YM and dLdY1,...,dLdYM, respectively, where M is the
        %    number of outputs.
        %  - For layers with multiple learnable parameters, replace 
        %    dLdW with dLdW1,...,dLdWP, where P is the number of 
        %    learnable parameters.
        %  - For layers with multiple state parameters, replace dLdSin
        %    and dLdSout with dLdSin1,...,dLdSinK and 
        %    dLdSout1,...,dLdSoutK, respectively, where K is the number
        %    of state parameters.

        % Define layer backward function here.
    end
end

end

Name Layer

First, give the layer a name. In the first line of the class file, replace the existing name myLayer with peepholeLSTMLayer. To allow the layer to output different data formats, for example data with the format "CBT" (channel, batch, time) for sequence output and the format "CB" (channel, batch) for single time step or feature output, also include the nnet.layer.Formattable class.

classdef peepholeLSTMLayer < nnet.layer.Layer & nnet.layer.Formattable
    ...
end

Next, rename the myLayer constructor function (the first function in the methods section) so that it has the same name as the layer.

methods
    function layer = peepholeLSTMLayer()           
        ...
    end

    ...
 end

Save Layer

Save the layer class file in a new file named peepholeLSTMLayer.m. The file name must match the layer name. To use the layer, you must save the file in the current folder or in a folder on the MATLAB path.

Declare Properties, State, and Learnable Parameters

Declare the layer properties in the properties section, the layer states in the properties (State) section, and the learnable parameters in the properties (Learnable) section.

By default, custom layers have these properties. Do not declare these properties in the properties section.

Property - Description
Name - Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet and dlnetwork functions automatically assign names to layers with the name "".
Description - One-line description of the layer, specified as a string scalar or a character vector. This description appears when the layer is displayed in a Layer array. If you do not specify a layer description, then the software displays the layer class name.
Type - Type of the layer, specified as a character vector or a string scalar. The value of Type appears when the layer is displayed in a Layer array. If you do not specify a layer type, then the software displays the layer class name.
NumInputs - Number of inputs of the layer, specified as a positive integer. If you do not specify this value, then the software automatically sets NumInputs to the number of names in InputNames. The default value is 1.
InputNames - Input names of the layer, specified as a cell array of character vectors. If you do not specify this value and NumInputs is greater than 1, then the software automatically sets InputNames to {'in1',...,'inN'}, where N is equal to NumInputs. The default value is {'in'}.
NumOutputs - Number of outputs of the layer, specified as a positive integer. If you do not specify this value, then the software automatically sets NumOutputs to the number of names in OutputNames. The default value is 1.
OutputNames - Output names of the layer, specified as a cell array of character vectors. If you do not specify this value and NumOutputs is greater than 1, then the software automatically sets OutputNames to {'out1',...,'outM'}, where M is equal to NumOutputs. The default value is {'out'}.

If the layer has no other properties, then you can omit the properties section.

Tip

If you are creating a layer with multiple inputs, then you must set either the NumInputs or InputNames property in the layer constructor. If you are creating a layer with multiple outputs, then you must set either the NumOutputs or OutputNames property in the layer constructor. For an example, see Define Custom Deep Learning Layer with Multiple Inputs.
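For instance, the constructor of a hypothetical two-input layer (not part of the peephole LSTM example) could declare its input count like this:

    function layer = additionLayerExample(name)
        % Declare two inputs so that the software can connect
        % both incoming layers to this layer.
        layer.NumInputs = 2;
        layer.Name = name;
    end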

Declare the number of hidden units and the output mode in the properties section, with the names NumHiddenUnits and OutputMode.
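This properties block matches the one in the completed class file shown later in this example:

properties
    % Layer properties.

    NumHiddenUnits
    OutputMode
end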

A peephole LSTM layer has four learnable parameters: the input weights, the recurrent weights, the peephole weights, and the bias. Declare these learnable parameters in the properties (Learnable) section with the names InputWeights, RecurrentWeights, PeepholeWeights, and Bias, respectively.

properties (Learnable)
    % Layer learnable parameters.

    InputWeights
    RecurrentWeights
    PeepholeWeights
    Bias
end

A peephole LSTM layer has two state parameters: the hidden state and the cell state. Declare these state parameters in the properties (State) section with the names HiddenState and CellState, respectively.

properties (State)
    % Layer state parameters.

    HiddenState
    CellState
end

Parallel training of networks containing custom layers with state parameters using the trainnet function is not supported. When you train a network with custom layers with state parameters, the ExecutionEnvironment training option must be "auto", "gpu", or "cpu".

Create Constructor Function

Create the function that constructs the layer and initializes the layer properties. Specify any variables required to create the layer as inputs to the constructor function.

The peephole LSTM layer constructor function requires one input argument (the number of hidden units) and two optional arguments (the layer name and output mode). Specify one input argument named numHiddenUnits in the peepholeLSTMLayer function that corresponds to the number of hidden units. Specify the optional input arguments as a single argument with the name args. Add a comment to the top of the function that explains the syntaxes of the function.

    function layer = peepholeLSTMLayer(numHiddenUnits,args)
        %PEEPHOLELSTMLAYER Peephole LSTM Layer
        %   layer = peepholeLSTMLayer(numHiddenUnits)
        %   creates a peephole LSTM layer with the specified number of
        %   hidden units.
        %
        %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value)
        %   creates a peephole LSTM layer and specifies additional
        %   options using one or more name-value arguments:
        %
        %      Name       - Name of the layer, specified as a string.
        %                   The default is "".
        %
        %      OutputMode - Output mode, specified as one of the
        %                   following:
        %                      "sequence" - Output the entire sequence
        %                                   of data.
        %
        %                      "last"     - Output the last time step
        %                                   of the data.
        %                   The default is "sequence".

        ...
    end

Initialize Layer Properties

Initialize the layer properties in the constructor function. Replace the comment % Define layer constructor function here with code that initializes the layer properties. Do not initialize learnable or state parameters in the constructor function; initialize them in the initialize function instead.

Parse the input arguments using an arguments block and set the NumHiddenUnits, Name, and OutputMode properties.

        arguments
            numHiddenUnits
            args.Name = "";
            args.OutputMode = "sequence";
        end

        layer.NumHiddenUnits = numHiddenUnits;
        layer.Name = args.Name;
        layer.OutputMode = args.OutputMode;

Give the layer a one-line description by setting theDescription property of the layer. Set the description to describe the type of the layer and its size.

        % Set layer description.
        layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";

View the completed constructor function.

    function layer = peepholeLSTMLayer(numHiddenUnits,args)
        %PEEPHOLELSTMLAYER Peephole LSTM Layer
        %   layer = peepholeLSTMLayer(numHiddenUnits)
        %   creates a peephole LSTM layer with the specified number of
        %   hidden units.
        %
        %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value)
        %   creates a peephole LSTM layer and specifies additional
        %   options using one or more name-value arguments:
        %
        %      Name       - Name of the layer, specified as a string.
        %                   The default is "".
        %
        %      OutputMode - Output mode, specified as one of the
        %                   following:
        %                      "sequence" - Output the entire sequence
        %                                   of data.
        %
        %                      "last"     - Output the last time step
        %                                   of the data.
        %                   The default is "sequence".

        % Parse input arguments.
        arguments
            numHiddenUnits
            args.Name = "";
            args.OutputMode = "sequence";
        end

        layer.NumHiddenUnits = numHiddenUnits;
        layer.Name = args.Name;
        layer.OutputMode = args.OutputMode;

        % Set layer description.
        layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";
    end

With this constructor function, the command peepholeLSTMLayer(200,OutputMode="last",Name="peephole") creates a peephole LSTM layer with 200 hidden units and the name "peephole" that outputs the last time step of the peephole LSTM operation.
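To check the constructor, create such a layer at the command line:

layer = peepholeLSTMLayer(200,OutputMode="last",Name="peephole")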

Create Initialize Function

Create the function that initializes the layer learnable and state parameters when the software initializes the network. Ensure that the function initializes learnable and state parameters only when the corresponding property is empty; otherwise, the software can overwrite these parameters when you load the network from a MAT file.

Because the size of the input data is unknown until the network is ready to use, you must create an initialize function that initializes the learnable and state parameters using networkDataLayout objects that the software provides to the function. Network data layout objects contain information about the sizes and formats of expected input data. Create an initialize function that uses the size and format information to initialize learnable and state parameters such that they have the correct size.
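For example, the following sketch creates a layout for "CBT" sequence data with 12 channels and unknown batch and time sizes (the sizes are illustrative) and then initializes the layer manually; during training, the software performs this step for you. Note that the call requires the helper initializers described below to be on the path.

layout = networkDataLayout([12 NaN NaN],"CBT");

layer = peepholeLSTMLayer(100);
layer = initialize(layer,layout);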

Initialize the input weights using Glorot initialization. Initialize the recurrent weights using orthogonal initialization. Initialize the bias using unit-forget-gate normalization. This code uses the helper functions initializeGlorot, initializeOrthogonal, and initializeUnitForgetGate. To access these functions, open the example in the Include Custom Layer in Network section as a live script. For more information about initializing weights, see Initialize Learnable Parameters for Model Function.
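If you are not working from the live script, the three helpers can be sketched along these lines, following Initialize Learnable Parameters for Model Function (treat this as a reference sketch rather than the shipped code):

function weights = initializeGlorot(sz,numOut,numIn)
    % Glorot (Xavier) uniform initialization.
    Z = 2*rand(sz,"single") - 1;
    bound = sqrt(6 / (numIn + numOut));
    weights = dlarray(bound * Z);
end

function parameter = initializeOrthogonal(sz)
    % Orthogonal initialization from the QR decomposition of a
    % random normal matrix.
    Z = randn(sz,"single");
    [Q,R] = qr(Z,0);
    D = diag(R);
    parameter = dlarray(Q * diag(D ./ abs(D)));
end

function bias = initializeUnitForgetGate(numHiddenUnits)
    % Zero bias with the forget-gate section set to one.
    bias = zeros(4*numHiddenUnits,1,"single");
    idx = numHiddenUnits+1:2*numHiddenUnits;
    bias(idx) = 1;
    bias = dlarray(bias);
end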

Note that the learnable parameters of a peephole LSTM layer differ from those of a standard LSTM layer. The cell candidate calculation does not use peephole weights, so PeepholeWeights is a 3*NumHiddenUnits-by-1 array, whereas InputWeights and RecurrentWeights, which contribute to all four components, have 4*NumHiddenUnits rows.

For convenience, initialize the state parameters using the resetState function defined in the section Create Reset State Function.

    function layer = initialize(layer,layout)
        % layer = initialize(layer,layout) initializes the layer
        % learnable and state parameters.
        %
        % Inputs:
        %         layer  - Layer to initialize.
        %         layout - Data layout, specified as a
        %                  networkDataLayout object.
        %
        % Outputs:
        %         layer - Initialized layer.

        numHiddenUnits = layer.NumHiddenUnits;

        % Find number of channels.
        idx = finddim(layout,"C");
        numChannels = layout.Size(idx);

        % Initialize input weights.
        if isempty(layer.InputWeights)
            sz = [4*numHiddenUnits numChannels];
            numOut = 4*numHiddenUnits;
            numIn = numChannels;
            layer.InputWeights = initializeGlorot(sz,numOut,numIn);
        end

        % Initialize recurrent weights.
        if isempty(layer.RecurrentWeights)
            sz = [4*numHiddenUnits numHiddenUnits];
            layer.RecurrentWeights = initializeOrthogonal(sz);
        end

        % Initialize peephole weights.
        if isempty(layer.PeepholeWeights)
            sz = [3*numHiddenUnits 1];
            numOut = 3*numHiddenUnits;
            numIn = 1;

            layer.PeepholeWeights = initializeGlorot(sz,numOut,numIn);
        end

        % Initialize bias.
        if isempty(layer.Bias)
            layer.Bias = initializeUnitForgetGate(numHiddenUnits);
        end

        % Initialize hidden state.
        if isempty(layer.HiddenState)
            layer.HiddenState = zeros(numHiddenUnits,1);
        end

        % Initialize cell state.
        if isempty(layer.CellState)
            layer.CellState = zeros(numHiddenUnits,1);
        end
    end

Create Predict Function

Create the layer forward functions to use at prediction time and training time.

Create a function named predict that propagates the data forward through the layer at prediction time and outputs the result.

The predict function syntax depends on the type of layer. You can adjust the syntax for layers with multiple inputs, multiple outputs, or multiple state parameters, as described in the comments of the layer template.

Tip

If the number of inputs to the layer can vary, then use varargin instead of X1,…,XN. In this case, varargin is a cell array of the inputs, where varargin{i} corresponds to Xi.

If the number of outputs can vary, then use varargout instead of Y1,…,YM. In this case, varargout is a cell array of the outputs, where varargout{j} corresponds to Yj.
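For instance, a hypothetical layer that sums a variable number of inputs (not part of this example) could implement predict with varargin:

    function Y = predict(~,varargin)
        % Sum an arbitrary number of layer inputs element-wise.
        Y = varargin{1};
        for i = 2:numel(varargin)
            Y = Y + varargin{i};
        end
    end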

Tip

If the custom layer has a dlnetwork object for a learnable parameter, then in the predict function of the custom layer, use the predict function for the dlnetwork. When you do so, the dlnetwork object predict function uses the appropriate layer operations for prediction. If the dlnetwork has state parameters, then also return the network state.

Because a peephole LSTM layer has only one input, one output, and two state parameters, the syntax for predict for a peephole LSTM layer is [Y,hiddenState,cellState] = predict(layer,X).

By default, the layer uses predict as the forward function at training time. To use a different forward function at training time, or retain a value required for a custom backward function, you must also create a function named forward.

Because the layer inherits from nnet.layer.Formattable, the layer inputs are formatted dlarray objects and the predict function must also output data as formatted dlarray objects.

The hidden state at time step t is given by

$$h_t = o_t \odot \sigma_c(c_t),$$

where $\odot$ denotes the Hadamard product (element-wise multiplication of vectors).

The cell state at time step t is given by

$$c_t = g_t \odot i_t + f_t \odot c_{t-1}.$$

The following formulas describe the components at time step t, where $W$, $R$, $p$, and $b$ denote the input weights, recurrent weights, peephole weights, and bias, respectively.

Component - Formula
Input gate - $i_t = \sigma_g(W_i x_t + R_i h_{t-1} + p_i \odot c_{t-1} + b_i)$
Forget gate - $f_t = \sigma_g(W_f x_t + R_f h_{t-1} + p_f \odot c_{t-1} + b_f)$
Cell candidate - $g_t = \sigma_c(W_g x_t + R_g h_{t-1} + b_g)$
Output gate - $o_t = \sigma_g(W_o x_t + R_o h_{t-1} + p_o \odot c_t + b_o)$

Note that the output gate calculation requires the updated cell state $c_t$.

In these calculations, $\sigma_g$ and $\sigma_c$ denote the gate and state activation functions. For peephole LSTM layers, use the sigmoid and hyperbolic tangent functions as the gate and state activation functions, respectively.

Implement this operation in the predict function. Because the layer does not require a different forward function for training or a memory value for a custom backward function, you can remove the forward function from the class file. Add a comment to the top of the function that explains the syntaxes of the function.

Tip

If you preallocate arrays using functions such as zeros, then you must ensure that the data types of these arrays are consistent with the layer function inputs. To create an array of zeros of the same data type as another array, use the "like" option of zeros. For example, to initialize an array of zeros of size sz with the same data type as the array X, use Y = zeros(sz,"like",X).

    function [Y,hiddenState,cellState] = predict(layer,X)
        %PREDICT Peephole LSTM predict function
        %   [Y,hiddenState,cellState] = predict(layer,X) forward
        %   propagates the data X through the layer and returns the
        %   layer output Y and the updated hidden and cell states. X
        %   is a dlarray with format "CBT" and Y is a dlarray with
        %   format "CB" or "CBT", depending on the layer OutputMode
        %   property.

        % Initialize sequence output.
        numHiddenUnits = layer.NumHiddenUnits;
        miniBatchSize = size(X,finddim(X,"B"));
        numTimeSteps = size(X,finddim(X,"T"));

        if layer.OutputMode == "sequence"
            Y = zeros(numHiddenUnits,miniBatchSize,numTimeSteps,"like",X);
            Y = dlarray(Y,"CBT");
        end

        % Calculate WX + b.
        X = stripdims(X);
        WX = pagemtimes(layer.InputWeights,X) + layer.Bias;

        % Indices of concatenated weight arrays.
        idx1 = 1:numHiddenUnits;
        idx2 = 1+numHiddenUnits:2*numHiddenUnits;
        idx3 = 1+2*numHiddenUnits:3*numHiddenUnits;
        idx4 = 1+3*numHiddenUnits:4*numHiddenUnits;

        % Initial states.
        hiddenState = layer.HiddenState;
        cellState = layer.CellState;

        % Loop over time steps.
        for t = 1:numTimeSteps
            % Calculate R*h_{t-1}.
            Rht = layer.RecurrentWeights * hiddenState;

            % Calculate p*c_{t-1}.
            pict = layer.PeepholeWeights(idx1) .* cellState;
            pfct = layer.PeepholeWeights(idx2) .* cellState;

            % Gate calculations.
            it = sigmoid(WX(idx1,:,t) + Rht(idx1,:) + pict);
            ft = sigmoid(WX(idx2,:,t) + Rht(idx2,:) + pfct);
            gt = tanh(WX(idx3,:,t) + Rht(idx3,:));
            
            % Calculate ot using updated cell state.
            cellState = gt .* it + cellState .* ft;
            poct = layer.PeepholeWeights(idx3) .* cellState;
            ot = sigmoid(WX(idx4,:,t) + Rht(idx4,:) + poct);

            % Update hidden state.
            hiddenState = tanh(cellState) .* ot;

            % Update sequence output.
            if layer.OutputMode == "sequence"
                Y(:,:,t) = hiddenState;
            end
        end

        % Last time step output.
        if layer.OutputMode == "last"
            Y = dlarray(hiddenState,"CB");
        end
    end

Because the predict function uses only functions that support dlarray objects, defining the backward function is optional. For a list of functions that support dlarray objects, see List of Functions with dlarray Support.

Create Reset State Function

The resetState function for dlnetwork objects, by default, has no effect on custom layers with state parameters. To define the layer behavior for the resetState function for network objects, define the optional layer resetState function in the layer definition that resets the state parameters.

The resetState function must have the syntax layer = resetState(layer), where the returned layer has the reset state properties.

The resetState function must not set any layer properties except for learnable and state properties. If the function sets other layer properties, then the layer can behave unexpectedly. (since R2023a)

Create a function named resetState that resets the layer state parameters to vectors of zeros.

    function layer = resetState(layer)
        %RESETSTATE Reset layer state
        % layer = resetState(layer) resets the state properties of the
        % layer.

        numHiddenUnits = layer.NumHiddenUnits;
        layer.HiddenState = zeros(numHiddenUnits,1);
        layer.CellState = zeros(numHiddenUnits,1);
    end

Completed Layer

View the completed layer class file.

classdef peepholeLSTMLayer < nnet.layer.Layer & nnet.layer.Formattable
    %PEEPHOLELSTMLAYER Peephole LSTM Layer

properties
    % Layer properties.

    NumHiddenUnits
    OutputMode
end

properties (Learnable)
    % Layer learnable parameters.

    InputWeights
    RecurrentWeights
    PeepholeWeights
    Bias
end

properties (State)
    % Layer state parameters.

    HiddenState
    CellState
end

methods
    function layer = peepholeLSTMLayer(numHiddenUnits,args)
        %PEEPHOLELSTMLAYER Peephole LSTM Layer
        %   layer = peepholeLSTMLayer(numHiddenUnits)
        %   creates a peephole LSTM layer with the specified number of
        %   hidden units.
        %
        %   layer = peepholeLSTMLayer(numHiddenUnits,Name=Value)
        %   creates a peephole LSTM layer and specifies additional
        %   options using one or more name-value arguments:
        %
        %      Name       - Name of the layer, specified as a string.
        %                   The default is "".
        %
        %      OutputMode - Output mode, specified as one of the
        %                   following:
        %                      "sequence" - Output the entire sequence
        %                                   of data.
        %
        %                      "last"     - Output the last time step
        %                                   of the data.
        %                   The default is "sequence".

        % Parse input arguments.
        arguments
            numHiddenUnits
            args.Name = "";
            args.OutputMode = "sequence";
        end

        layer.NumHiddenUnits = numHiddenUnits;
        layer.Name = args.Name;
        layer.OutputMode = args.OutputMode;

        % Set layer description.
        layer.Description = "Peephole LSTM with " + numHiddenUnits + " hidden units";
    end

    function layer = initialize(layer,layout)
        % layer = initialize(layer,layout) initializes the layer
        % learnable and state parameters.
        %
        % Inputs:
        %         layer  - Layer to initialize.
        %         layout - Data layout, specified as a
        %                  networkDataLayout object.
        %
        % Outputs:
        %         layer - Initialized layer.

        numHiddenUnits = layer.NumHiddenUnits;

        % Find number of channels.
        idx = finddim(layout,"C");
        numChannels = layout.Size(idx);

        % Initialize input weights.
        if isempty(layer.InputWeights)
            sz = [4*numHiddenUnits numChannels];
            numOut = 4*numHiddenUnits;
            numIn = numChannels;
            layer.InputWeights = initializeGlorot(sz,numOut,numIn);
        end

        % Initialize recurrent weights.
        if isempty(layer.RecurrentWeights)
            sz = [4*numHiddenUnits numHiddenUnits];
            layer.RecurrentWeights = initializeOrthogonal(sz);
        end

        % Initialize peephole weights.
        if isempty(layer.PeepholeWeights)
            sz = [3*numHiddenUnits 1];
            numOut = 3*numHiddenUnits;
            numIn = 1;

            layer.PeepholeWeights = initializeGlorot(sz,numOut,numIn);
        end

        % Initialize bias.
        if isempty(layer.Bias)
            layer.Bias = initializeUnitForgetGate(numHiddenUnits);
        end

        % Initialize hidden state.
        if isempty(layer.HiddenState)
            layer.HiddenState = zeros(numHiddenUnits,1);
        end

        % Initialize cell state.
        if isempty(layer.CellState)
            layer.CellState = zeros(numHiddenUnits,1);
        end
    end

    function [Y,hiddenState,cellState] = predict(layer,X)
        %PREDICT Peephole LSTM predict function
        %   [Y,hiddenState,cellState] = predict(layer,X) forward
        %   propagates the data X through the layer and returns the
        %   layer output Y and the updated hidden and cell states. X
        %   is a dlarray with format "CBT" and Y is a dlarray with
        %   format "CB" or "CBT", depending on the layer OutputMode
        %   property.

        % Initialize sequence output.
        numHiddenUnits = layer.NumHiddenUnits;
        miniBatchSize = size(X,finddim(X,"B"));
        numTimeSteps = size(X,finddim(X,"T"));

        if layer.OutputMode == "sequence"
            Y = zeros(numHiddenUnits,miniBatchSize,numTimeSteps,"like",X);
            Y = dlarray(Y,"CBT");
        end

        % Calculate WX + b.
        X = stripdims(X);
        WX = pagemtimes(layer.InputWeights,X) + layer.Bias;

        % Indices of concatenated weight arrays.
        idx1 = 1:numHiddenUnits;
        idx2 = 1+numHiddenUnits:2*numHiddenUnits;
        idx3 = 1+2*numHiddenUnits:3*numHiddenUnits;
        idx4 = 1+3*numHiddenUnits:4*numHiddenUnits;

        % Initial states.
        hiddenState = layer.HiddenState;
        cellState = layer.CellState;

        % Loop over time steps.
        for t = 1:numTimeSteps
            % Calculate R*h_{t-1}.
            Rht = layer.RecurrentWeights * hiddenState;

            % Calculate p*c_{t-1}.
            pict = layer.PeepholeWeights(idx1) .* cellState;
            pfct = layer.PeepholeWeights(idx2) .* cellState;

            % Gate calculations.
            it = sigmoid(WX(idx1,:,t) + Rht(idx1,:) + pict);
            ft = sigmoid(WX(idx2,:,t) + Rht(idx2,:) + pfct);
            gt = tanh(WX(idx3,:,t) + Rht(idx3,:));
            
            % Calculate ot using updated cell state.
            cellState = gt .* it + cellState .* ft;
            poct = layer.PeepholeWeights(idx3) .* cellState;
            ot = sigmoid(WX(idx4,:,t) + Rht(idx4,:) + poct);

            % Update hidden state.
            hiddenState = tanh(cellState) .* ot;

            % Update sequence output.
            if layer.OutputMode == "sequence"
                Y(:,:,t) = hiddenState;
            end
        end

        % Last time step output.
        if layer.OutputMode == "last"
            Y = dlarray(hiddenState,"CB");
        end
    end

    function layer = resetState(layer)
        %RESETSTATE Reset layer state
        % layer = resetState(layer) resets the state properties of the
        % layer.

        numHiddenUnits = layer.NumHiddenUnits;
        layer.HiddenState = zeros(numHiddenUnits,1);
        layer.CellState = zeros(numHiddenUnits,1);
    end
end

end

GPU Compatibility

If the layer forward functions fully support dlarray objects, then the layer is GPU compatible. Otherwise, to be GPU compatible, the layer functions must support inputs and return outputs of type gpuArray (Parallel Computing Toolbox).

Many MATLAB built-in functions support gpuArray (Parallel Computing Toolbox) and dlarray input arguments. For a list of functions that support dlarray objects, see List of Functions with dlarray Support. For a list of functions that execute on a GPU, see Run MATLAB Functions on a GPU (Parallel Computing Toolbox). To use a GPU for deep learning, you must also have a supported GPU device. For information on supported devices, see GPU Computing Requirements (Parallel Computing Toolbox). For more information on working with GPUs in MATLAB, see GPU Computing in MATLAB (Parallel Computing Toolbox).

In this example, the MATLAB functions used in predict all support dlarray objects, so the layer is GPU compatible.
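For example, assuming you have Parallel Computing Toolbox and a supported GPU, you can pass gpuArray-backed data through the layer (a sketch with illustrative sizes, which requires the helper initializers to be on the path):

layer = peepholeLSTMLayer(100);
layer = initialize(layer,networkDataLayout([12 NaN NaN],"CBT"));

% Move illustrative input data to the GPU and run prediction.
X = dlarray(gpuArray(rand(12,16,20,"single")),"CBT");
[Y,hiddenState,cellState] = predict(layer,X);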

Include Custom Layer in Network

You can use a custom layer in the same way as any other layer in Deep Learning Toolbox. Create and train a network for sequence classification using the peephole LSTM layer you created earlier.

Load the example training data.

load JapaneseVowelsTrainData

Define the network architecture. Create a layer array containing a peephole LSTM layer.

inputSize = 12;
numHiddenUnits = 100;
numClasses = 9;

layers = [
    sequenceInputLayer(inputSize)
    peepholeLSTMLayer(numHiddenUnits,OutputMode="last")
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

Specify the training options and train the network. Train using the CPU with a mini-batch size of 27 and left-pad the data.

options = trainingOptions("adam", ...
    ExecutionEnvironment="cpu", ...
    MiniBatchSize=27, ...
    SequencePaddingDirection="left");
net = trainNetwork(XTrain,TTrain,layers,options);

|========================================================================================|
|  Epoch  |  Iteration  |  Time Elapsed  |  Mini-batch  |  Mini-batch  |  Base Learning  |
|         |             |   (hh:mm:ss)   |   Accuracy   |     Loss     |      Rate       |
|========================================================================================|
|       1 |           1 |       00:00:01 |        3.70% |       2.2060 |          0.0010 |
|       5 |          50 |       00:00:08 |       92.59% |       0.5917 |          0.0010 |
|      10 |         100 |       00:00:15 |       92.59% |       0.2182 |          0.0010 |
|      15 |         150 |       00:00:21 |      100.00% |       0.0588 |          0.0010 |
|      20 |         200 |       00:00:28 |       96.30% |       0.0937 |          0.0010 |
|      25 |         250 |       00:00:34 |      100.00% |       0.0497 |          0.0010 |
|      30 |         300 |       00:00:40 |      100.00% |       0.0173 |          0.0010 |
|========================================================================================|
Training finished: Max epochs completed.

Evaluate the network performance by predicting on new data and calculating the accuracy.

load JapaneseVowelsTestData
YTest = classify(net,XTest,MiniBatchSize=27);
accuracy = mean(YTest==TTest)

References

[1] Greff, Klaus, Rupesh K. Srivastava, Jan Koutník, Bas R. Steunebrink, and Jürgen Schmidhuber. "LSTM: A Search Space Odyssey." IEEE Transactions on Neural Networks and Learning Systems 28, no. 10 (2016): 2222–2232.

See Also

trainnet | trainingOptions | dlnetwork | functionLayer | checkLayer | setLearnRateFactor | setL2Factor | getLearnRateFactor | getL2Factor | findPlaceholderLayers | replaceLayer | PlaceholderLayer | networkDataLayout