Networks and Layers Supported for Code Generation - MATLAB & Simulink

MATLAB® Coder™ supports code generation for dlnetwork (Deep Learning Toolbox), series, and directed acyclic graph (DAG) networks. You can generate code for any trained neural network that is composed of supported deep learning networks, layers, and classes.

Supported Pretrained Networks

These pretrained networks, available in Deep Learning Toolbox™, are supported for code generation. You can use the imagePretrainedNetwork (Deep Learning Toolbox) function to load these pretrained neural networks as dlnetwork (Deep Learning Toolbox) objects for code generation. Note that code generation does not support setting the Weights name-value argument of the imagePretrainedNetwork function to "none". For example, use this code to load a pretrained GoogLeNet neural network.

net = imagePretrainedNetwork("googlenet")

You can use analyzeNetworkForCodegen to check whether a network is supported for code generation with a specific deep learning library. For example:

result = analyzeNetworkForCodegen(imagePretrainedNetwork("googlenet"),TargetLibrary='none')
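After confirming support, a typical code generation workflow saves the network to a MAT-file, loads it inside an entry-point function with coder.loadDeepLearningNetwork, and then runs codegen. The sketch below is illustrative, not from this page: the file name googlenetNet.mat, the function name googlenet_predict, and the 224-by-224-by-3 input size are assumptions.

% In MATLAB, save the pretrained network to a MAT-file once:
%   net = imagePretrainedNetwork("googlenet");
%   save googlenetNet.mat net

function out = googlenet_predict(in) %#codegen
% Entry-point function: load the network once, then predict.
persistent mynet;
if isempty(mynet)
    mynet = coder.loadDeepLearningNetwork('googlenetNet.mat');
end
out = predict(mynet, dlarray(single(in), 'SSC'));
end

% Generate a MEX function targeting generic C/C++ (no third-party library):
%   cfg = coder.config('mex');
%   cfg.DeepLearningConfig = coder.DeepLearningConfig('none');
%   codegen -config cfg googlenet_predict -args {ones(224,224,3,'single')}

Setting coder.DeepLearningConfig('none') generates library-free C/C++ code; pass 'mkldnn' or 'arm-compute' instead to target those libraries.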

These imagePretrainedNetwork model name arguments are supported for generic C/C++ code generation and for the Intel® MKL-DNN and ARM® Compute libraries:
"alexnet"
"darknet19"
"darknet53"
"densenet201"
"efficientnetb0"
"googlenet"
"inceptionresnetv2"
"inceptionv3"
"mobilenetv2"
"nasnetlarge"
"nasnetmobile"
"resnet18"
"resnet50"
"resnet101"
"shufflenet" NoteThe ARM Compute library is not supported forshufflenet.
"squeezenet"
"vgg16"
"vgg19"
"xception"

Supported Layers

These layers are supported for code generation for the target deep learning libraries specified in the table.

Note

In the following tables, the information icon indicates that the network, layer, or class has limited code generation capabilities. You might see errors and unexpected behavior. For more information, see the Extended Capabilities section on the page for that network, layer, or class.

Input Layers

Convolution and Fully Connected Layers

Sequence Layers

Activation Layers

Normalization Layers

Utility Layers

Resizing Layers

Layer | Generic C/C++ | Intel MKL-DNN and ARM Compute Libraries
resize2dLayer (Image Processing Toolbox)

Pooling and Unpooling Layers

Combination Layers

Transformer Layers

Computer Vision and Image Processing Layers

Custom Layers

Layer | Generic C/C++ | Intel MKL-DNN and ARM Compute Libraries
Custom layers

Custom layers are layers, with or without learnable parameters, that you define for your problem. See Define Custom Deep Learning Layers (Deep Learning Toolbox) and Define Custom Deep Learning Layer for Code Generation (Deep Learning Toolbox).

Code generation limitations:

- The outputs of the custom layer must be fixed-size arrays.
- Custom layers in sequence networks are supported for generic C/C++ code generation only.
- For code generation, custom layers must contain the %#codegen pragma.
- You can pass a dlarray to a custom layer if the custom layer is in a dlnetwork, or if the custom layer is in a DAG or series network and either inherits from nnet.layer.Formattable or has no backward propagation.
- For unsupported dlarray methods, you must extract the underlying data from the dlarray, perform the computations, and reconstruct the data back into a dlarray for code generation. For example:

function Z = predict(layer, X)
    if coder.target('MATLAB')
        Z = doPredict(X);
    else
        if isdlarray(X)
            X1 = extractdata(X);
            Z1 = doPredict(X1);
            Z = dlarray(Z1);
        else
            Z = doPredict(X);
        end
    end
end
Custom Output Layer (Deep Learning Toolbox)

An output layer, including custom classification or regression output layers created by the nnet.layer.ClassificationLayer or nnet.layer.RegressionLayer functions. For an example showing how to define a custom classification output layer and specify a loss function, see Define Custom Classification Output Layer (Deep Learning Toolbox). For an example showing how to define a custom regression output layer and specify a loss function, see Define Custom Regression Output Layer (Deep Learning Toolbox).
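The limitations above can be illustrated with a minimal custom layer skeleton. This sketch is not from this page: the class name, Scale property, and scaling operation are hypothetical, chosen only to show the %#codegen pragma and a fixed-size output.

classdef codegenScalingLayer < nnet.layer.Layer
    %#codegen
    % Illustrative custom layer suitable for code generation:
    % the predict output has the same, fixed size as the input.

    properties
        Scale % learnable-free scaling factor (example property)
    end

    methods
        function layer = codegenScalingLayer(scale, name)
            layer.Name = name;
            layer.Scale = scale;
        end

        function Z = predict(layer, X)
            % Element-wise scaling; output size matches input size,
            % satisfying the fixed-size output requirement.
            Z = layer.Scale .* X;
        end
    end
end

Because predict uses only element-wise operations supported on dlarray, no extractdata/dlarray round trip is needed here; that workaround applies only when a method is unsupported for dlarray inputs.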

Custom Keras Layers

Layer | Generic C/C++ | Intel MKL-DNN and ARM Compute Libraries
nnet.keras.layer.ClipLayer (Deep Learning Toolbox)
nnet.keras.layer.FlattenCStyleLayer (Deep Learning Toolbox)
nnet.keras.layer.GlobalAveragePooling2dLayer (Deep Learning Toolbox)
nnet.keras.layer.PreluLayer (Deep Learning Toolbox)
nnet.keras.layer.SigmoidLayer (Deep Learning Toolbox)
nnet.keras.layer.TanhLayer (Deep Learning Toolbox)
nnet.keras.layer.TimeDistributedFlattenCStyleLayer (Deep Learning Toolbox)
nnet.keras.layer.ZeroPadding2dLayer (Deep Learning Toolbox)

Custom ONNX Layers

Layer | Generic C/C++ | Intel MKL-DNN and ARM Compute Libraries
nnet.onnx.layer.ClipLayer (Deep Learning Toolbox)
nnet.onnx.layer.ElementwiseAffineLayer (Deep Learning Toolbox)
nnet.onnx.layer.FlattenInto2dLayer (Deep Learning Toolbox)
nnet.onnx.layer.FlattenLayer (Deep Learning Toolbox)
nnet.onnx.layer.GlobalAveragePooling2dLayer (Deep Learning Toolbox)
nnet.onnx.layer.IdentityLayer (Deep Learning Toolbox)
nnet.onnx.layer.PreluLayer (Deep Learning Toolbox)
nnet.onnx.layer.SigmoidLayer (Deep Learning Toolbox)
nnet.onnx.layer.TanhLayer (Deep Learning Toolbox)
nnet.onnx.layer.VerifyBatchSizeLayer (Deep Learning Toolbox)

Supported Classes

See Also

imagePretrainedNetwork (Deep Learning Toolbox) | analyzeNetworkForCodegen