checkLayer - Check validity of custom or function layer - MATLAB

Check validity of custom or function layer

Syntax

Description

checkLayer(layer,layout1,...,layoutN) checks the validity of a layer using the specified networkDataLayout objects, where N is the number of layer inputs and layoutK corresponds to the input layer.InputNames(K). (since R2023b)


checkLayer(layer,validInputSize) checks the validity of a custom or function layer using generated data of the sizes in validInputSize. For layers with a single input, set validInputSize to a typical size of input data to the layer. For layers with multiple inputs, set validInputSize to a cell array of typical sizes, where each element corresponds to a layer input. This syntax does not support layers that inherit from the nnet.layer.Formattable class.


checkLayer(___,Name=Value) specifies additional options using one or more name-value arguments.


Examples


Check Validity of Custom Flatten Layer

Create a function layer object that applies the flatten function to the layer input. The flatten function is defined at the end of this example and collapses the spatial dimensions of the input dlarray into the channel dimension.

customFlattenLayer = functionLayer(@(X) flatten(X),Formattable=true)

customFlattenLayer = FunctionLayer with properties:

         Name: ''
   PredictFcn: @(X)flatten(X)
  Formattable: 1
Acceleratable: 0

   Learnable Parameters
   No properties.

   State Parameters
   No properties.

Use properties method to see a list of all properties.

Specify the size and dimensions of the inputs to the layer using networkDataLayout objects.

layout = networkDataLayout([227 227 3 NaN],"SSCB")

layout = networkDataLayout with properties:

  Size: [227 227 3 NaN]
Format: 'SSCB'

Check that the layer is valid using the checkLayer function.

checkLayer(customFlattenLayer,layout)

Skipping initialization tests. The layer does not have an initialize function.

Skipping GPU tests. No compatible GPU device found.

Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.

Running nnet.checklayer.TestLayerWithoutBackward .......... ........ Done nnet.checklayer.TestLayerWithoutBackward


Test Summary: 18 Passed, 0 Failed, 0 Incomplete, 16 Skipped. Time elapsed: 0.40086 seconds.

In this case, the function does not detect any issues with the layer.

Flatten Function

The flatten function receives a formatted dlarray as input and collapses the spatial dimensions of the input dlarray into the channel dimension. The input dlarray must not contain time ("T") or unspecified ("U") dimensions.

function Y = flatten(X)

% Find spatial, channel, and batch dimensions.
idxS = finddim(X,"S");
idxC = finddim(X,"C");
idxB = finddim(X,"B");

% Determine size of spatial and channel dimensions.
sizeS = size(X,idxS);
sizeC = size(X,idxC);

if ~isempty(idxB)
    % If the input has a batch dimension, determine the size of the output
    % channel dimension.
    numChannels = sizeC*prod(sizeS,"all");
    sizeB = size(X,idxB);

    % Reshape and format output in "CB" format.
    X = reshape(X,[numChannels sizeB]);
    Y = dlarray(X,"CB");
else
    % If the input does not have a batch dimension, reshape and output in
    % "CU" format.
    X = X(:);
    Y = dlarray(X,"CU");
end

end

Check Custom Layer Validity

Check the validity of the example custom layer sreluLayer.

The custom layer sreluLayer, attached to this example as a supporting file, applies the SReLU operation to the input data. To access this layer, open this example as a live script.

Create an instance of the layer.

layer = sreluLayer;

Create a networkDataLayout object that specifies the expected input size and format of typical input to the layer. Specify a valid input size of [24 24 20 128], where the dimensions correspond to the height, width, number of channels, and number of observations of the previous layer output. Specify that the data has the format "SSCB" (spatial, spatial, channel, batch).

validInputSize = [24 24 20 128];
layout = networkDataLayout(validInputSize,"SSCB");

Check the layer validity using checkLayer. When you pass data through the network, the layer expects 4-D array inputs, where the first three dimensions correspond to the height, width, and number of channels of the previous layer output, and the fourth dimension corresponds to the observations.

checkLayer(layer,layout)

Skipping GPU tests. No compatible GPU device found.

Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.

Running nnet.checklayer.TestLayerWithoutBackward .......... .......... Done nnet.checklayer.TestLayerWithoutBackward


Test Summary: 20 Passed, 0 Failed, 0 Incomplete, 14 Skipped. Time elapsed: 0.11032 seconds.

The results show the number of passed, failed, and skipped tests. If you do not have a GPU, then the function skips the corresponding tests.

Check Function Layer Validity

Create a function layer object that applies the softsign operation to the input. The softsign operation is given by the function f(x) = x/(1 + |x|).

layer = functionLayer(@(X) X./(1 + abs(X)))

layer = FunctionLayer with properties:

         Name: ''
   PredictFcn: @(X)X./(1+abs(X))
  Formattable: 0
Acceleratable: 0

   Learnable Parameters
   No properties.

   State Parameters
   No properties.

Use properties method to see a list of all properties.

Check that the layer is valid using the checkLayer function. Set the valid input size to the typical size of a single observation input to the layer. For example, for a single input, the layer expects observations of size _h_-by-_w_-by-_c_, where _h_, _w_, and _c_ are the height, width, and number of channels of the previous layer output, respectively.

Specify validInputSize as the typical size of an input array.

validInputSize = [5 5 20];
checkLayer(layer,validInputSize)

Skipping initialization tests. The layer does not have an initialize function.

Skipping multi-observation tests. To enable tests with multiple observations, specify a formatted networkDataLayout as the second argument or specify the ObservationDimension option. For 2-D image data, set ObservationDimension to 4. For 3-D image data, set ObservationDimension to 5. For sequence data, set ObservationDimension to 2.

Skipping GPU tests. No compatible GPU device found.

Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.

Running nnet.checklayer.TestLayerWithoutBackward .......... .. Done nnet.checklayer.TestLayerWithoutBackward


Test Summary: 12 Passed, 0 Failed, 0 Incomplete, 22 Skipped. Time elapsed: 0.1614 seconds.

The results show the number of passed, failed, and skipped tests. If you do not specify the ObservationDimension option, or do not have a GPU, then the function skips the corresponding tests.

Check Multiple Observations

For multi-observation image input, the layer expects an array of observations of size _h_-by-_w_-by-_c_-by-N, where h, w, and c are the height, width, and number of channels, respectively, and N is the number of observations.

To check the layer validity for multiple observations, specify the typical size of an observation and set the ObservationDimension option to 4.

layer = functionLayer(@(X) X./(1 + abs(X)));
validInputSize = [5 5 20];
checkLayer(layer,validInputSize,ObservationDimension=4)

Skipping initialization tests. The layer does not have an initialize function.

Skipping GPU tests. No compatible GPU device found.

Skipping code generation compatibility tests. To check validity of the layer for code generation, specify the CheckCodegenCompatibility and ObservationDimension options.

Running nnet.checklayer.TestLayerWithoutBackward .......... ........ Done nnet.checklayer.TestLayerWithoutBackward


Test Summary: 18 Passed, 0 Failed, 0 Incomplete, 16 Skipped. Time elapsed: 0.07352 seconds.

In this case, the function does not detect any issues with the layer.

Check Custom Layer for Code Generation Compatibility

Check the code generation compatibility of the custom layer codegenSReLULayer.

The custom layer codegenSReLULayer, attached to this example as a supporting file, applies the SReLU operation to the input data. To access this layer, open this example as a live script.

Create an instance of the layer.

layer = codegenSReLULayer;

Create a networkDataLayout object that specifies the expected input size and format of typical input to the layer. Specify a valid input size of [24 24 20 128], where the dimensions correspond to the height, width, number of channels, and number of observations of the previous layer output. Specify the format as "SSCB" (spatial, spatial, channel, batch).

validInputSize = [24 24 20 128];
layout = networkDataLayout(validInputSize,"SSCB");

Check the layer validity using checkLayer. To check for code generation compatibility, set the CheckCodegenCompatibility option to true. The checkLayer function does not check that the layer uses MATLAB functions that are compatible with code generation. To check that the custom layer definition is supported for code generation, first use the Code Generation Readiness app. For more information, see Check Code by Using the Code Generation Readiness Tool (MATLAB Coder).

checkLayer(layer,layout,CheckCodegenCompatibility=true)

Skipping GPU tests. No compatible GPU device found.

Running nnet.checklayer.TestLayerWithoutBackward .......... .......... ..... Done nnet.checklayer.TestLayerWithoutBackward


Test Summary: 25 Passed, 0 Failed, 0 Incomplete, 9 Skipped. Time elapsed: 1.2189 seconds.

The function does not detect any issues with the layer.

Input Arguments


layer — Layer to check

nnet.layer.Layer object | FunctionLayer

Layer to check, specified as an nnet.layer.Layer or FunctionLayer object.

If the layer has learnable or state parameters that require initialization before the layer can be evaluated, or if the layer has a custom initialize function, then you must specify a network data layout or the layer must be initialized.

For an example showing how to define your own custom layer, see Define Custom Deep Learning Layer with Learnable Parameters. To create a layer that applies a specified function, use functionLayer.

layout1,...,layoutN — Network data layouts

networkDataLayout object

Since R2023b

Valid network data layouts for each input to the layer, specified as networkDataLayout objects.

If the layer inherits from the nnet.layer.Formattable class, you must specify a networkDataLayout for each input to the layer.

For large input sizes, the gradient checks take longer to run. To speed up the check, specify a network data layout with a smaller size using the Size property.
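
For example, this sketch (assuming a variable layer that contains the layer to check) replaces a large image layout with one that has a smaller spatial size but the same format, so the numerical gradient checks run faster:

% Smaller spatial size, same "SSCB" format. Assumes layer already exists.
layout = networkDataLayout([24 24 3 NaN],"SSCB");
checkLayer(layer,layout)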

validInputSize — Valid input sizes

vector of positive integers | cell array of vectors of positive integers

Valid input sizes of the layer, specified as a vector of positive integers or cell array of vectors of positive integers.

For more information, see Layer Input Sizes.

For large input sizes, the gradient checks take longer to run. To speed up the check, specify a smaller valid input size.

Example: [5 5 10]

Example: {[24 24 20],[24 24 10]}

Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64 | cell
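
For example, for a layer with two inputs, pass one typical size per input in a cell array. The following is a minimal sketch that assumes a hypothetical two-input custom layer class named myTwoInputLayer.

% myTwoInputLayer is a hypothetical custom layer class with two inputs.
layer = myTwoInputLayer;
validInputSize = {[24 24 20],[24 24 10]};
checkLayer(layer,validInputSize,ObservationDimension=4)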

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: ObservationDimension=4 sets the observation dimension to 4

ObservationDimension — Observation dimension

positive integer | row vector of positive integers

Observation dimension, specified as a positive integer or row vector of positive integers. The default is the position of the batch ("B") dimensions of the network data layouts layout1,...,layoutN.

The observation dimension specifies which dimension of the layer input data corresponds to observations. For example, if the layer expects input data of size _h_-by-_w_-by-_c_-by-_N_, where _h_, _w_, and _c_ correspond to the height, width, and number of channels of the input data, respectively, and _N_ corresponds to the number of observations, then the observation dimension is 4. For more information, see Layer Input Sizes.

If you specify a network data layout with a batch dimension or if you specify the observation dimension, then the checkLayer function checks that the layer functions are valid using generated data with mini-batches of size 1 and 2. Otherwise, the function skips the corresponding tests.

Example: 4

Example: [4 4 2]

Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64
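
For example, for the softsign function layer from the earlier example, the following sketch shows two ways to enable the mini-batch tests: pass a formatted layout whose "B" dimension supplies the observation dimension, or pass an unformatted size and set ObservationDimension explicitly.

layer = functionLayer(@(X) X./(1 + abs(X)));

% Option 1: formatted layout; the "B" position sets the observation dimension.
layout = networkDataLayout([5 5 20 NaN],"SSCB");
checkLayer(layer,layout)

% Option 2: unformatted size with an explicit observation dimension.
checkLayer(layer,[5 5 20],ObservationDimension=4)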

CheckCodegenCompatibility — Flag to enable code generation tests

0 (false) (default) | 1 (true)

Flag to enable code generation tests, specified as 0 (false) or 1 (true).

If CheckCodegenCompatibility is1 (true), then you must specify a layout whose Format property includes a batch ("B") dimension or specify the ObservationDimension option.

The CheckCodegenCompatibility option does not support layers that inherit from nnet.layer.Formattable. Instead, use the analyzeNetworkForCodegen (MATLAB Coder) function.

In addition, when generating code that uses third-party libraries:

The checkLayer function does not check that functions used by the layer are compatible with code generation. To check that functions used by the custom layer also support code generation, first use the Code Generation Readiness app. For more information, see Check Code by Using the Code Generation Readiness Tool (MATLAB Coder).

For an example showing how to define a custom layer that supports code generation, see Define Custom Deep Learning Layer for Code Generation.

Data Types: logical

More About


Layer Input Sizes

For each layer, the valid input sizes and network data layouts depend on the output of the previous layer.

Layer Input | Example Shape | Data Format
2-D images | _h_-by-_w_-by-_c_-by-_N_ numeric array, where _h_, _w_, _c_, and _N_ are the height, width, number of channels of the images, and number of observations, respectively. | "SSCB"
3-D images | _h_-by-_w_-by-_d_-by-_c_-by-_N_ numeric array, where _h_, _w_, _d_, _c_, and _N_ are the height, width, depth, number of channels of the images, and number of image observations, respectively. | "SSSCB"
Vector sequences | _c_-by-_N_-by-_s_ matrix, where _c_ is the number of features of the sequence, _N_ is the number of sequence observations, and _s_ is the sequence length. | "CBT"
2-D image sequences | _h_-by-_w_-by-_c_-by-_N_-by-_s_ array, where _h_, _w_, and _c_ correspond to the height, width, and number of channels of the image, respectively, _N_ is the number of image sequence observations, and _s_ is the sequence length. | "SSCBT"
3-D image sequences | _h_-by-_w_-by-_d_-by-_c_-by-_N_-by-_s_ array, where _h_, _w_, _d_, and _c_ correspond to the height, width, depth, and number of channels of the image, respectively, _N_ is the number of image sequence observations, and _s_ is the sequence length. | "SSSCBT"
Features | _c_-by-_N_ array, where _c_ is the number of features, and _N_ is the number of observations. | "CB"

For example, for 2-D image classification problems, create a networkDataLayout object specifying the size as [h w c n] and the format as "SSCB", where h, w, and c correspond to the height, width, and number of channels of the images, respectively, and n corresponds to the number of observations.
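
A minimal sketch of such a layout, assuming 28-by-28 RGB images and an unknown number of observations:

% [h w c n] with the number of observations left unknown (NaN).
layout = networkDataLayout([28 28 3 NaN],"SSCB");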

Code generation supports layers with 2-D image input only.

Algorithms


List of Tests

The checkLayer function uses these tests to check the validity of custom layers.

Test | Description
functionSyntaxesAreCorrect | The syntaxes of the layer functions are correctly defined.
predictDoesNotError | The predict function does not error.
forwardDoesNotError | When specified, the forward function does not error.
forwardPredictAreConsistentInSize | When forward is specified, forward and predict output values of the same size.
backwardDoesNotError | When specified, backward does not error.
backwardIsConsistentInSize | When backward is specified, the outputs of backward are consistent in size: the derivatives with respect to each input are the same size as the corresponding input, and the derivatives with respect to each learnable parameter are the same size as the corresponding learnable parameter.
predictIsConsistentInType | The outputs of predict are consistent in type with the inputs.
forwardIsConsistentInType | When forward is specified, the outputs of forward are consistent in type with the inputs.
backwardIsConsistentInType | When backward is specified, the outputs of backward are consistent in type with the inputs.
gradientsAreNumericallyCorrect | When backward is specified, the gradients computed in backward are consistent with the numerical gradients.
backwardPropagationDoesNotError | When backward is not specified, the derivatives can be computed using automatic differentiation.
predictReturnsValidStates | For layers with state properties, the predict function returns valid states.
forwardReturnsValidStates | For layers with state properties, the forward function, if specified, returns valid states.
resetStateDoesNotError | For layers with state properties, the resetState function, if specified, does not error and resets the states to valid states.
formattableLayerPredictIsFormatted (since R2023b) | For layers that inherit from the nnet.layer.Formattable class, the predict function returns a formatted dlarray with a channel dimension.
formattableLayerForwardIsFormatted (since R2023b) | For layers that inherit from the nnet.layer.Formattable class, the forward function, if specified, returns a formatted dlarray with a channel dimension.
initializeDoesNotChangeLearnableParametersWhenTheyAreNotEmpty (since R2023b) | When you specify one or more networkDataLayout objects, the learnable parameters of the layer do not change after repeated initialization with the same networkDataLayout objects as input.
initializeDoesNotChangeStatefulParametersWhenTheyAreNotEmpty (since R2023b) | When you specify one or more networkDataLayout objects, the state parameters of the layer do not change after repeated initialization with the same networkDataLayout objects as input.
codegenPragmaDefinedInClassDef | The pragma "%#codegen" for code generation is specified in the class file.
layerPropertiesSupportCodegen | The layer properties support code generation.
predictSupportsCodegen | predict is valid for code generation.
doesNotHaveStateProperties | For code generation, the layer does not have state properties.
functionLayerSupportsCodegen | For code generation, the layer function must be a named function on the path and the Formattable property must be 0 (false).

Some tests run multiple times. These tests also check different data types and GPU compatibility:

To execute the layer functions on a GPU, the functions must support inputs and outputs of type gpuArray with the underlying data type single.
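
For example, this sketch builds a function layer from an operation that supports gpuArray inputs with underlying type single, so the GPU tests can run when a compatible GPU is available:

% max supports gpuArray and single inputs, so the layer function can run on a GPU.
layer = functionLayer(@(X) max(X,0));
checkLayer(layer,[5 5 20],ObservationDimension=4)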

For more information on the tests used by checkLayer, see Check Custom Layer Validity.

Version History

Introduced in R2018a


Custom output layers are not recommended. Use a custom loss function in the trainnet function instead.

This recommendation means that calling checkLayer with custom output layers is also not recommended.

There are no plans to remove support for custom output layers. However, the trainnet function has advantages and is recommended instead.

This table shows some typical usages of the trainNetwork function with custom output layers and how to update your code to use the trainnet function instead.

Not Recommended | Recommended
net = trainNetwork(X,T,layers,options), where layers contains a custom output layer. | net = trainnet(X,T,layers,lossFcn,options); In this example, layers specifies the same network without the custom output layer and lossFcn is a function handle that specifies the custom loss function.
net = trainNetwork(data,layers,options), where layers contains a custom output layer. | net = trainnet(data,layers,lossFcn,options); In this example, layers specifies the same network without the custom output layer and lossFcn is a function handle that specifies the custom loss function.
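
For example, a minimal sketch of the recommended pattern, assuming in-memory predictors X, targets T, a layer array layers without the custom output layer, and training options stored in options:

% Custom loss passed to trainnet as a function handle (here, mean squared error).
lossFcn = @(Y,T) mean((Y-T).^2,"all");
net = trainnet(X,T,layers,lossFcn,options);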

R2023b: Check formattable custom layers and check custom layers without initializing

You can now use the checkLayer function to check the validity of custom layers that inherit from the nnet.layer.Formattable class by specifying a networkDataLayout object as the second argument.

You can also check the validity of a custom layer with a custom initialize function without first initializing the layer by using the checkLayer function and specifying a networkDataLayout object as the second argument.