setLearnRateFactor - Set learn rate factor of layer learnable parameter - MATLAB

Set learn rate factor of layer learnable parameter

Syntax

Description

layerUpdated = setLearnRateFactor(layer,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in layer to factor.

For built-in layers, you can set the learn rate factor directly by using the corresponding property. For example, for a convolution2dLayer layer, the syntax layer = setLearnRateFactor(layer,'Weights',factor) is equivalent to layer.WeightLearnRateFactor = factor.
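As a minimal sketch of this equivalence for a built-in convolution layer (the factor value 2 is illustrative):

```matlab
% Create a built-in 2-D convolution layer.
layer = convolution2dLayer(5,20);

% Option 1: set the learn rate factor using the function.
layer = setLearnRateFactor(layer,'Weights',2);

% Option 2: set the corresponding property directly (equivalent).
layer.WeightLearnRateFactor = 2;
```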


layerUpdated = setLearnRateFactor(layer,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the layer is a networkLayer or when the parameter is in a dlnetwork object in a custom layer.


netUpdated = setLearnRateFactor(net,layerName,parameterName,factor) sets the learn rate factor of the parameter with the name parameterName in the layer with name layerName for the specified dlnetwork object.


netUpdated = setLearnRateFactor(net,parameterPath,factor) sets the learn rate factor of the parameter specified by the path parameterPath. Use this syntax when the parameter is in a networkLayer or when the parameter is in a dlnetwork object in a custom layer.


Examples


Set and Get Learning Rate Factor of Learnable Parameter

Set and get the learning rate factor of a learnable parameter of a custom SReLU layer.

Create a layer array containing the custom layer sreluLayer, attached to this example as a supporting file. To access this layer, open this example as a live script.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(5,20)
    batchNormalizationLayer
    sreluLayer
    fullyConnectedLayer(10)
    softmaxLayer];

Set the learn rate factor of the LeftThreshold learnable parameter of the sreluLayer to 2.

layers(4) = setLearnRateFactor(layers(4),"LeftThreshold",2);

View the updated learn rate factor.

factor = getLearnRateFactor(layers(4),"LeftThreshold")

Set and Get Learning Rate Factor of Custom Nested Layer Learnable Parameter

Set and get the learning rate factor of a learnable parameter of a nested layer defined using network composition.

Create a residual block layer using the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

numFilters = 64;
layer = residualBlockLayer(numFilters)

layer = 
  residualBlockLayer with properties:

    Name: ''

   Learnable Parameters
    Network: [1x1 dlnetwork]

   State Parameters
    Network: [1x1 dlnetwork]

  Use properties method to see a list of all properties.

View the layers of the nested network.

layer.Network.Layers

ans = 7x1 Layer array with layers:

 1   'conv_1'        2-D Convolution       64 3x3 convolutions with stride [1  1] and padding 'same'
 2   'batchnorm_1'   Batch Normalization   Batch normalization
 3   'relu_1'        ReLU                  ReLU
 4   'conv_2'        2-D Convolution       64 3x3 convolutions with stride [1  1] and padding 'same'
 5   'batchnorm_2'   Batch Normalization   Batch normalization
 6   'add'           Addition              Element-wise addition of 2 inputs
 7   'relu_2'        ReLU                  ReLU

Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setLearnRateFactor function.

factor = 2;
layer = setLearnRateFactor(layer,'Network/conv_1/Weights',factor);

Get the updated learning rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(layer,'Network/conv_1/Weights')

Set and Get Learn Rate Factor of dlnetwork Learnable Parameter

Set and get the learning rate factor of a learnable parameter of a dlnetwork object.

Create a dlnetwork object and add layers to it.

net = dlnetwork;

layers = [
    imageInputLayer([28 28 1],Normalization="none",Name="in")
    convolution2dLayer(5,20,Name="conv")
    batchNormalizationLayer(Name="bn")
    reluLayer(Name="relu")
    fullyConnectedLayer(10,Name="fc")
    softmaxLayer(Name="sm")];

net = addLayers(net,layers);

Set the learn rate factor of the 'Weights' learnable parameter of the convolution layer to 2 using the setLearnRateFactor function.

factor = 2;
net = setLearnRateFactor(net,'conv','Weights',factor);

Get the updated learn rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(net,'conv',"Weights")

Set and Get Learning Rate Factor of Nested Layer Learnable Parameter

Create an array of layers containing an lstmLayer with 100 hidden units and a dropoutLayer with a dropout probability of 0.2.

layers = [
    lstmLayer(100,OutputMode="sequence",Name="lstm")
    dropoutLayer(0.2,Name="dropout")];

Create a network layer containing these layers.

lstmDropoutLayer = networkLayer(layers,Name="lstmDropout");

Use the network layer to build a network.

layers = [
    sequenceInputLayer(3)
    lstmDropoutLayer
    lstmDropoutLayer
    fullyConnectedLayer(10)
    softmaxLayer];

Create a dlnetwork object. You can also create a dlnetwork object by training the network using the trainnet function.

net = dlnetwork(layers);

Set the learning rate factor of the InputWeights learnable parameter of the LSTM layer in the first network layer to 2 using the setLearnRateFactor function.

factor = 2;
net = setLearnRateFactor(net,"lstmDropout_1/lstm/InputWeights",factor);

Get the updated learning rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(net,"lstmDropout_1/lstm/InputWeights")

Set and Get Learning Rate Factor of Custom Nested dlnetwork Learnable Parameter

Set and get the learning rate factor of a learnable parameter of a custom nested layer defined using network composition in a dlnetwork object.

Create a dlnetwork object containing the custom layer residualBlockLayer attached to this example as a supporting file. To access this file, open this example as a Live Script.

inputSize = [224 224 3];
numFilters = 32;
numClasses = 5;

layers = [
    imageInputLayer(inputSize,'Normalization','none','Name','in')
    convolution2dLayer(7,numFilters,'Stride',2,'Padding','same','Name','conv')
    groupNormalizationLayer('all-channels','Name','gn')
    reluLayer('Name','relu')
    maxPooling2dLayer(3,'Stride',2,'Name','max')
    residualBlockLayer(numFilters,'Name','res1')
    residualBlockLayer(numFilters,'Name','res2')
    residualBlockLayer(2*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res3')
    residualBlockLayer(2*numFilters,'Name','res4')
    residualBlockLayer(4*numFilters,'Stride',2,'IncludeSkipConvolution',true,'Name','res5')
    residualBlockLayer(4*numFilters,'Name','res6')
    globalAveragePooling2dLayer('Name','gap')
    fullyConnectedLayer(numClasses,'Name','fc')
    softmaxLayer('Name','sm')];

dlnet = dlnetwork(layers);

View the layers of the nested network in the layer 'res1'.

dlnet.Layers(6).Network.Layers

ans = 7x1 Layer array with layers:

 1   'conv_1'        2-D Convolution       32 3x3x32 convolutions with stride [1  1] and padding 'same'
 2   'batchnorm_1'   Batch Normalization   Batch normalization with 32 channels
 3   'relu_1'        ReLU                  ReLU
 4   'conv_2'        2-D Convolution       32 3x3x32 convolutions with stride [1  1] and padding 'same'
 5   'batchnorm_2'   Batch Normalization   Batch normalization with 32 channels
 6   'add'           Addition              Element-wise addition of 2 inputs
 7   'relu_2'        ReLU                  ReLU

Set the learning rate factor of the learnable parameter 'Weights' of the layer 'conv_1' to 2 using the setLearnRateFactor function.

factor = 2;
dlnet = setLearnRateFactor(dlnet,'res1/Network/conv_1/Weights',factor);

Get the updated learning rate factor using the getLearnRateFactor function.

factor = getLearnRateFactor(dlnet,'res1/Network/conv_1/Weights')

Freeze Learnable Parameters

Load a pretrained network.

net = imagePretrainedNetwork;

The Learnables property of the dlnetwork object is a table that contains the learnable parameters of the network. The table includes parameters of nested layers in separate rows. View the first few rows of the learnables table.

learnables = net.Learnables;
head(learnables)

      Layer           Parameter           Value       
__________________    _________    ___________________

"conv1"               "Weights"    {3x3x3x64  dlarray}
"conv1"               "Bias"       {1x1x64    dlarray}
"fire2-squeeze1x1"    "Weights"    {1x1x64x16 dlarray}
"fire2-squeeze1x1"    "Bias"       {1x1x16    dlarray}
"fire2-expand1x1"     "Weights"    {1x1x16x64 dlarray}
"fire2-expand1x1"     "Bias"       {1x1x64    dlarray}
"fire2-expand3x3"     "Weights"    {3x3x16x64 dlarray}
"fire2-expand3x3"     "Bias"       {1x1x64    dlarray}

To freeze the learnable parameters of the network, loop over the learnable parameters and set the learn rate factor to 0 using the setLearnRateFactor function.

factor = 0;

numLearnables = size(learnables,1);
for i = 1:numLearnables
    layerName = learnables.Layer(i);
    parameterName = learnables.Parameter(i);
    net = setLearnRateFactor(net,layerName,parameterName,factor);
end

To use the updated learn rate factors when training, you must pass the dlnetwork object to the update function in the custom training loop. For example, use the command

[net,velocity] = sgdmupdate(net,gradients,velocity);
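In context, a single iteration of a custom training loop might look like the following sketch (modelLoss, X, and T are hypothetical stand-ins for your loss function and training data):

```matlab
% Evaluate the loss and gradients; dlfeval enables automatic differentiation.
[loss,gradients] = dlfeval(@modelLoss,net,X,T);

% Update the network with SGDM. Parameters whose learn rate factor is 0
% receive an effective learning rate of 0 and therefore stay frozen.
[net,velocity] = sgdmupdate(net,gradients,velocity);
```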

Input Arguments


layer — Input layer

scalar Layer object

Input layer, specified as a scalar Layer object.

parameterName — Parameter name

character vector | string scalar

Parameter name, specified as a character vector or a string scalar.

factor — Learning rate factor

nonnegative scalar

Learning rate factor for the parameter, specified as a nonnegative scalar.

The software multiplies this factor by the global learning rate to determine the learning rate for the specified parameter. For example, if factor is 2, then the learning rate for the specified parameter is twice the current global learning rate. The software determines the global learning rate based on the settings specified with the trainingOptions function.

Example: 2
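As a sketch of the arithmetic (the InitialLearnRate value is illustrative): with a global learning rate of 0.01 and a factor of 2, the parameter trains at an effective rate of 0.02.

```matlab
% Global learning rate set through trainingOptions (illustrative value).
options = trainingOptions("sgdm",InitialLearnRate=0.01);

% A parameter with learn rate factor 2 trains at 2 * 0.01 = 0.02.
effectiveRate = 2 * options.InitialLearnRate;
```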

parameterPath — Path to parameter in nested layer

string scalar | character vector

Path to parameter in nested layer, specified as a string scalar or a character vector. A nested layer can be a layer within a networkLayer or a custom layer that itself defines a neural network as a learnable parameter.

If the input to setLearnRateFactor is a layer, then the path has the form "propertyName/layerName/parameterName", where propertyName is the name of the layer property containing a dlnetwork object, layerName is the name of a layer in that network, and parameterName is the name of the parameter. For example, "Network/conv_1/Weights".

If the input to setLearnRateFactor is a dlnetwork object and the desired parameter is in a nested layer, then the path begins with the name of the nested layer, followed by the path to the parameter within it. For example, "lstmDropout_1/lstm/InputWeights" for a parameter in a networkLayer, or "res1/Network/conv_1/Weights" for a parameter in a dlnetwork object in a custom layer.

Data Types: char | string

net — Neural network

dlnetwork object

Neural network, specified as a dlnetwork object.

layerName — Layer name

string scalar | character vector

Layer name, specified as a string scalar or a character vector.

Data Types: char | string

Output Arguments


layerUpdated — Updated layer

Layer object

Updated layer, returned as a Layer object.

netUpdated — Updated network

dlnetwork object

Updated network, returned as a dlnetwork object.

Version History

Introduced in R2017b


R2024a: Specify Path to Parameter in Network Layer

Specify the path to a parameter in a networkLayer using the parameterPath argument.