Custom Training Loops - MATLAB & Simulink
Customize deep learning training loops and loss functions
If the trainingOptions function does not provide the training options that you need for your task, or custom output layers do not support the loss functions that you need, then you can define a custom training loop. For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
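The core of a custom training loop is a model loss function that runs the network forward, computes a loss, and returns gradients via automatic differentiation. A minimal sketch of that pattern follows; the names `modelLoss`, `net`, `X`, and `T` are illustrative, and `net` is assumed to be a dlnetwork object.

```matlab
% Sketch of a model loss function for a custom training loop.
% Evaluate it with dlfeval so that dlgradient can trace the computation:
%   [loss,gradients] = dlfeval(@modelLoss,net,X,T);
function [loss,gradients] = modelLoss(net,X,T)
    Y = forward(net,X);                           % network output in training mode
    loss = crossentropy(Y,T);                     % classification loss
    gradients = dlgradient(loss,net.Learnables);  % gradients w.r.t. learnable parameters
end
```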
Functions
Network Building
plot - Plot neural network architecture
summary - Print network summary (Since R2022b)
analyzeNetwork - Analyze deep learning network architecture
checkLayer - Check validity of custom or function layer
isequal - Check equality of neural networks (Since R2021a)
isequaln - Check equality of neural networks, ignoring NaN values (Since R2021a)
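These inspection functions are typically used before writing the training loop itself. A brief sketch, assuming `net` is a dlnetwork object and `layer` is a custom layer you have defined:

```matlab
% Sketch (names are illustrative; net is a dlnetwork, layer a custom layer).
plot(net)                    % visualize the layer graph
summary(net)                 % print layer sizes and learnable parameter counts
analyzeNetwork(net)          % interactive report flagging errors and warnings
checkLayer(layer,[28 28 1])  % run validity checks on a custom layer
                             % for a 28-by-28-by-1 input size
```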
Custom Training Loops
forward - Compute deep learning network output for training
predict - Compute deep learning network output for inference
adamupdate - Update parameters using adaptive moment estimation (Adam)
rmspropupdate - Update parameters using root mean squared propagation (RMSProp)
sgdmupdate - Update parameters using stochastic gradient descent with momentum (SGDM)
lbfgsupdate - Update parameters using limited-memory BFGS (L-BFGS) (Since R2023a)
lbfgsState - State of limited-memory BFGS (L-BFGS) solver (Since R2023a)
dlupdate - Update parameters using custom function
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops (Since R2022b)
updateInfo - Update information values for custom training loops (Since R2022b)
recordMetrics - Record metric values for custom training loops (Since R2022b)
groupSubPlot - Group metrics in training plot (Since R2022b)
deep.gpu.deterministicAlgorithms - Set determinism of deep learning operations on the GPU to get reproducible results (Since R2024b)
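A minimal sketch of how these functions combine into an update loop with Adam and a progress monitor. It assumes `net` (a dlnetwork), `mbq` (a minibatchqueue), a `modelLoss` helper, and `numEpochs` are defined elsewhere; all names are illustrative.

```matlab
% Sketch of a custom training loop (assumes net, mbq, modelLoss, numEpochs).
monitor = trainingProgressMonitor(Metrics="Loss",Info="Epoch",XLabel="Iteration");
avgG = []; avgSqG = [];           % Adam solver state, empty on the first call
iteration = 0;
for epoch = 1:numEpochs
    shuffle(mbq);                 % reshuffle the data each epoch
    while hasdata(mbq) && ~monitor.Stop
        iteration = iteration + 1;
        [X,T] = next(mbq);        % obtain the next mini-batch
        [loss,gradients] = dlfeval(@modelLoss,net,X,T);
        [net,avgG,avgSqG] = adamupdate(net,gradients,avgG,avgSqG,iteration);
        recordMetrics(monitor,iteration,Loss=loss);
        updateInfo(monitor,Epoch=epoch);
    end
end
```

Swapping in sgdmupdate or rmspropupdate follows the same shape; only the solver state arguments differ.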
Data Processing
padsequences - Pad or truncate sequence data to the same length (Since R2021a)
minibatchqueue - Create mini-batches for deep learning (Since R2020b)
onehotencode - Encode data labels into one-hot vectors (Since R2020b)
onehotdecode - Decode probability vectors into class labels (Since R2020b)
next - Obtain next mini-batch of data from minibatchqueue (Since R2020b)
reset - Reset minibatchqueue to start of data (Since R2020b)
shuffle - Shuffle data in minibatchqueue (Since R2020b)
hasdata - Determine if minibatchqueue can return a mini-batch (Since R2020b)
partition - Partition minibatchqueue (Since R2020b)
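A short sketch of the data-side functions above. It assumes `imds` is an image datastore; the mini-batch size and data format are illustrative.

```matlab
% Sketch (assumes imds is an image datastore; values are illustrative).
mbq = minibatchqueue(imds, ...
    MiniBatchSize=128, ...
    MiniBatchFormat="SSCB");     % spatial, spatial, channel, batch
while hasdata(mbq)
    X = next(mbq);               % X is a formatted dlarray mini-batch
end
shuffle(mbq);                    % reshuffle and rewind between epochs
reset(mbq);                      % or rewind without shuffling

% One-hot encoding of categorical labels, and decoding back:
labels = categorical(["cat" "dog" "cat"]);
T = onehotencode(labels,1);                    % expand along dimension 1
pred = onehotdecode(T,categories(labels),1);   % recover class labels
```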
Automatic Differentiation
dlarray - Deep learning array for customization
dlgradient - Compute gradients for custom training loops using automatic differentiation
dljacobian - Jacobian matrix deep learning operation (Since R2024b)
dldivergence - Divergence of deep learning data (Since R2024b)
dllaplacian - Laplacian of deep learning data (Since R2024b)
dlfeval - Evaluate deep learning model for custom training loops
dims - Data format of dlarray object
finddim - Find dimensions with specified label
stripdims - Remove dlarray data format
extractdata - Extract data from dlarray
isdlarray - Check if object is a dlarray (Since R2020b)
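The key constraint is that dlgradient must be called inside a function that is evaluated by dlfeval. A self-contained sketch differentiating a simple scalar objective (the function name `objective` is illustrative):

```matlab
% Sketch: automatic differentiation with dlarray, dlgradient, and dlfeval.
x = dlarray([1 2 3]);
[f,g] = dlfeval(@objective,x);   % dlgradient must run inside dlfeval
grad = extractdata(g);           % back to a plain numeric array, here 2*x

function [f,g] = objective(x)
    f = sum(x.^2,"all");         % scalar-valued objective
    g = dlgradient(f,x);         % df/dx, traced automatically
end
```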
Loss Operations
crossentropy - Cross-entropy loss for classification tasks
indexcrossentropy - Index cross-entropy loss for classification tasks (Since R2024b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
ctc - Connectionist temporal classification (CTC) loss for unaligned sequence classification (Since R2021a)
mse - Half mean squared error
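A brief sketch of calling the loss operations on formatted dlarray data; the sizes (10 classes, batch of 32) and random inputs are purely illustrative.

```matlab
% Sketch: loss operations on formatted dlarray data (shapes illustrative).
Y = softmax(dlarray(randn(10,32),"CB"));          % predicted class probabilities
T = onehotencode(categorical(randi(10,1,32),1:10),1);  % one-hot targets, 10-by-32
lossC = crossentropy(Y,T);                        % classification loss

Yr = dlarray(randn(1,32),"CB");                   % regression predictions
Tr = randn(1,32);                                 % regression targets
lossR = mse(Yr,Tr);                               % half mean squared error
lossH = huber(Yr,Tr);                             % Huber loss
```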
Function Acceleration
dlaccelerate - Accelerate deep learning function for custom training loops (Since R2021a)
AcceleratedFunction - Accelerated deep learning function (Since R2021a)
clearCache - Clear accelerated deep learning function trace cache (Since R2021a)
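Acceleration wraps an existing model loss function so that repeated dlfeval calls reuse a cached trace. A sketch, assuming `modelLoss`, `net`, `X`, `T`, and `numIterations` as in a custom training loop:

```matlab
% Sketch (assumes modelLoss, net, X, T, numIterations exist elsewhere).
accfun = dlaccelerate(@modelLoss);   % returns an AcceleratedFunction
clearCache(accfun);                  % discard any previously cached traces
for iteration = 1:numIterations
    [loss,gradients] = dlfeval(accfun,net,X,T);  % reuses the cached trace
end
```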
Topics
Custom Training Loops
Automatic Differentiation
Deep Learning Function Acceleration

Featured Examples