LayerGraph - (Not recommended) Graph of network layers for deep learning - MATLAB

(Not recommended) Graph of network layers for deep learning

Description

A layer graph specifies the architecture of a neural network as a directed acyclic graph (DAG) of deep learning layers. The layers can have multiple inputs and multiple outputs.

Creation

Syntax

Description

`lgraph` = layerGraph creates an empty layer graph that contains no layers. You can add layers to the empty graph by using the addLayers function.
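For example, a minimal sketch of this workflow, using a small placeholder layer array chosen only for illustration:

lgraph = layerGraph;                       % empty layer graph
layersToAdd = [
    imageInputLayer([28 28 1],'Name','in')
    fullyConnectedLayer(10,'Name','fc')
    softmaxLayer('Name','sm')];
lgraph = addLayers(lgraph,layersToAdd);    % the layers in layersToAdd are connected sequentially to each other
plot(lgraph)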


`lgraph` = layerGraph(`layers`) creates a layer graph from an array of network layers and sets the Layers property. The layers in `lgraph` are connected in the same sequential order as in `layers`.

`lgraph` = layerGraph(`net`) extracts the layer graph of a SeriesNetwork, DAGNetwork, or dlnetwork object. For example, you can extract the layer graph of a pretrained network to perform transfer learning.
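As a sketch of the transfer learning use case, assuming the Deep Learning Toolbox Model for GoogLeNet Network support package is installed (any SeriesNetwork, DAGNetwork, or dlnetwork object works the same way):

net = googlenet;              % pretrained DAGNetwork (requires the support package)
lgraph = layerGraph(net);     % extract its layer graph for editing
figure
plot(lgraph)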

Properties


Layers — Network layers

Layer array

This property is read-only.

Network layers, specified as a Layer array.
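A minimal sketch of inspecting this property, where lgraph is a layer graph such as the one built in the example below:

lgraph.Layers        % Layer array listing every layer in the graph
lgraph.Layers(1)     % index into the array to inspect a single layer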

Connections — Layer connections

table

This property is read-only.

Layer connections, specified as a table with two columns.

Each table row represents a connection in the layer graph. The first column, Source, specifies the source of each connection. The second column, Destination, specifies the destination of each connection. The connection sources and destinations are either layer names or have the form "layerName/IOName", where "IOName" is the name of the layer input or output.

Data Types: table
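As an illustrative sketch, you can display this table and filter it by destination; the layer names shown are from the Create Layer Graph example below:

lgraph.Connections                               % table with Source and Destination columns
conns = lgraph.Connections;
conns(strcmp(conns.Destination,'add/in2'),:)     % connections into the 'add/in2' input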

InputNames — Names of input layers

cell array of character vectors

This property is read-only.

Names of the input layers, specified as a cell array of character vectors.

Data Types: cell

OutputNames — Names of output layers

cell array of character vectors

This property is read-only.

Names of the output layers, specified as a cell array of character vectors.

Data Types: cell
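A brief sketch of querying these properties (the variable name is assumed from the example below):

lgraph.InputNames     % cell array of character vectors, one per input layer
lgraph.OutputNames    % cell array of character vectors, one per output layer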

Object Functions

Functions that operate on LayerGraph objects include addLayers, removeLayers, replaceLayer, connectLayers, disconnectLayers, and plot.

Examples


Create Layer Graph

Create a simple layer graph for deep learning.

The simple network in this example consists of a main branch with layers connected sequentially and a shortcut connection containing a single 1-by-1 convolutional layer. An addition layer sums the output of the main branch and the output of the shortcut connection element-wise.

Create the main branch of the network as a layer array. The addition layer sums multiple inputs element-wise. Specify the number of inputs for the addition layer to sum. To easily add connections later, specify names for the first ReLU layer and the addition layer.

layers = [
    imageInputLayer([28 28 1])

    convolution2dLayer(5,16,'Padding','same')
    batchNormalizationLayer
    reluLayer('Name','relu_1')

    convolution2dLayer(3,32,'Padding','same','Stride',2)
    batchNormalizationLayer
    reluLayer
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer

    additionLayer(2,'Name','add')

    averagePooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

Create a layer graph from the layer array. layerGraph connects all the layers in layers sequentially. Plot the layer graph.

lgraph = layerGraph(layers);
figure
plot(lgraph)


Create the 1-by-1 convolutional layer and add it to the layer graph. Specify the number of convolutional filters and the stride so that the activation size matches the activation size of the third ReLU layer. This arrangement enables the addition layer to add the outputs of the third ReLU layer and the 1-by-1 convolutional layer. To check that the layer is in the graph, plot the layer graph.

skipConv = convolution2dLayer(1,32,'Stride',2,'Name','skipConv');
lgraph = addLayers(lgraph,skipConv);
figure
plot(lgraph)


Create the shortcut connection from the 'relu_1' layer to the 'add' layer. Because you specified two as the number of inputs to the addition layer when you created it, the layer has two inputs named 'in1' and 'in2'. The third ReLU layer is already connected to the 'in1' input. Connect the 'relu_1' layer to the 'skipConv' layer and the 'skipConv' layer to the 'in2' input of the 'add' layer. The addition layer now sums the outputs of the third ReLU layer and the 'skipConv' layer. To check that the layers are connected correctly, plot the layer graph.

lgraph = connectLayers(lgraph,'relu_1','skipConv');
lgraph = connectLayers(lgraph,'skipConv','add/in2');
figure
plot(lgraph)


Limitations

Version History

Introduced in R2017b


Starting in R2024a, LayerGraph objects are not recommended. Use dlnetwork objects instead.

There are no plans to remove support for LayerGraph objects. However, dlnetwork objects have several advantages over LayerGraph objects and are recommended instead.

Most functions that support LayerGraph objects also support dlnetwork objects. This table shows some typical usages of LayerGraph objects and how to update your code to use dlnetwork object functions instead. A sketch after the table rebuilds the network from the Create Layer Graph example using these functions.

| Not Recommended | Recommended |
| --- | --- |
| `lgraph = layerGraph;` | `net = dlnetwork;` |
| `lgraph = layerGraph(layers);` | `net = dlnetwork(layers,Initialize=false);` |
| `lgraph = layerGraph(net);` | `net = dag2dlnetwork(net);` |
| `lgraph = addLayers(lgraph,layers);` | `net = addLayers(net,layers);` |
| `lgraph = removeLayers(lgraph,layerNames);` | `net = removeLayers(net,layerNames);` |
| `lgraph = replaceLayer(lgraph,layerName,layers);` | `net = replaceLayer(net,layerName,layers);` |
| `lgraph = connectLayers(lgraph,s,d);` | `net = connectLayers(net,s,d);` |
| `lgraph = disconnectLayers(lgraph,s,d);` | `net = disconnectLayers(net,s,d);` |
| `plot(lgraph);` | `plot(net);` |
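For instance, here is a sketch of the Create Layer Graph example rebuilt with the recommended dlnetwork functions. It assumes the layers and skipConv variables defined in that example; because dlnetwork objects do not support output layers, the trailing classificationLayer is dropped before conversion.

net = dlnetwork(layers(1:end-1),Initialize=false);   % omit the classificationLayer
net = addLayers(net,skipConv);
net = connectLayers(net,'relu_1','skipConv');
net = connectLayers(net,'skipConv','add/in2');
figure
plot(net)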

To train a neural network specified as a dlnetwork object, use the trainnet function.

Instead of specifying the loss using an output layer, as in a LayerGraph workflow, specify the loss function using the loss function argument of the trainnet function.
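For example, a hedged sketch of the training call; XTrain and TTrain are placeholder image data and categorical labels, not defined on this page:

options = trainingOptions("adam",MaxEpochs=5);
net = trainnet(XTrain,TTrain,net,"crossentropy",options);   % loss specified as an argument, not an output layer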