Deep Learning Metrics

Use metrics to assess the performance of your deep learning model during and after training.

To specify which metrics to use during training, specify the Metrics option of the trainingOptions function. You can use this option only when you train a network using the trainnet function.

To plot the metrics during training, in the training options, specify Plots as "training-progress". If you specify the ValidationData training option, then the software also plots and records the metric values for the validation data. To output the metric values to the Command Window during training, in the training options, set Verbose to true.
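For example, the following commands track accuracy and F-score during training, plot the training progress, and print metric values to the Command Window. The network, loss, and data variables (layers, XTrain, TTrain, XVal, TVal) are placeholders for your own training setup, not part of the API.

% Track accuracy and F-score, plot training progress, and print
% metric values to the Command Window during training.
options = trainingOptions("adam", ...
    Metrics=["accuracy","fscore"], ...
    Plots="training-progress", ...
    ValidationData={XVal,TVal}, ...
    Verbose=true);
net = trainnet(XTrain,TTrain,layers,"crossentropy",options);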

You can also access the metrics after training using the TrainingHistory and ValidationHistory fields from the second output of the trainnet function.
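For example, assuming the same placeholder training variables as above:

[net,info] = trainnet(XTrain,TTrain,layers,"crossentropy",options);
trainingMetrics = info.TrainingHistory      % metrics recorded during training
validationMetrics = info.ValidationHistory  % metrics recorded on the validation data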

To specify which metrics to use when you test a neural network, use the metrics argument of the testnet function.

You can specify metrics using their built-in names as string inputs to the trainingOptions or testnet functions. For example, use this command.

metricValues = testnet(net,data,["accuracy","fscore"]);

If you require greater customization, then you can use metric objects and functions to specify additional options.

For example, use these commands.

customAccuracy = accuracyMetric(NumTopKClasses=5,AverageType="macro");
customCrossEntropy = @(Y,T)crossentropy(Y,T,Mask=customMask);
metricValues = testnet(net,data,{customAccuracy,"fscore",customCrossEntropy});

If there is no object or function for the metric that you need for your task, then you can create a custom metric using a function or class. For more information, see Custom Metrics.

Classification Metrics

This table compares metrics for classification tasks, such as accuracy, AUC, F-score, precision, and recall.

Regression Metrics

This table compares metrics for regression tasks. The equations include these variables:

$Y_i$ is the prediction for observation $i$.
$T_i$ is the target (true) value for observation $i$.
$n$ is the number of observations.

Deep Learning Regression Metrics

Root mean squared error (RMSE)
Description: Magnitude of the errors between the predicted and true values.
Use case: A general measure of model performance, expressed in the same units as the data. It can be sensitive to outliers.
Range: ≥ 0 (perfect model: 0)
Equation: $\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(Y_i - T_i)^2}$
Built-in name: "rmse"
Equivalent object: rmseMetric

Mean absolute percentage error (MAPE)
Description: Percentage magnitude of the errors between the predicted and true values.
Use case: Returns a percentage, making it an intuitive performance measure that is easy to compare across models, though it can perform poorly when target values are near zero.
Range: ≥ 0 (perfect model: 0)
Equation: $\mathrm{MAPE} = \frac{100}{n}\sum_{i=1}^{n}\left|\frac{T_i - Y_i}{T_i}\right|$
Built-in name: "mape"
Equivalent object: mapeMetric

R² (R-squared), also known as the coefficient of determination
Description: Measure of how well the predictions explain the variance in the true values.
Use case: A unitless measure of performance that is easy to compare across different models and data sets.
Range: ≤ 1 (perfect model: 1)
Equation: $R^2 = 1 - \frac{\sum_{i=1}^{n}(Y_i - T_i)^2}{\sum_{i=1}^{n}(T_i - \bar{T})^2}$, where $\bar{T} = \frac{1}{n}\sum_{i=1}^{n} T_i$
Built-in name: "rsquared"
Equivalent object: rSquaredMetric

Mean absolute error (MAE), also known as L1 loss
Description: Magnitude of the errors between the predicted and true values.
Use case: Provides an understanding of the average error. It is robust to outliers and expressed in the same units as the data.
Range: ≥ 0 (perfect model: 0)
Equation: $\mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}|Y_i - T_i|$
Built-in names: "mae" / "mean-absolute-error" / "l1loss"
Equivalent function: l1loss with NormalizationFactor set to "all-elements"

Mean squared error (MSE), also known as L2 loss
Description: Squared difference between the predicted and true values.
Use case: A general measure of model performance that penalizes outliers more heavily, making it suitable for applications where outliers are costly.
Range: ≥ 0 (perfect model: 0)
Equation: $\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}(Y_i - T_i)^2$
Built-in names: "mse" / "mean-squared-error" / "l2loss"
Equivalent function: l2loss with NormalizationFactor set to "all-elements"

Huber
Description: Combination of MSE and MAE.
Use case: Balances sensitivity to outliers with robust error measurement, making it suitable for data sets with some outliers.
Range: ≥ 0 (perfect model: 0)
Equation: $\mathrm{Huber} = \frac{1}{n}\sum_{i=1}^{n}\mathrm{Huber}_i$, where $\mathrm{Huber}_i = \begin{cases} \frac{1}{2}(Y_i - T_i)^2 & \text{if } |Y_i - T_i| \le 1 \\ |Y_i - T_i| - \frac{1}{2} & \text{otherwise} \end{cases}$
Built-in name: "huber"
Equivalent function: huber with NormalizationFactor set to "all-elements"
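As a quick check of these formulas, you can compute several of the metrics directly with basic MATLAB arithmetic. The prediction and target vectors here are made-up values for illustration.

% Made-up predictions and targets for illustration.
Y = [2.5 0.0 2.0 8.0];   % predictions
T = [3.0 -0.5 2.0 7.0];  % targets

rmseValue = sqrt(mean((Y - T).^2))                     % 0.6124
maeValue  = mean(abs(Y - T))                           % 0.5
mseValue  = mean((Y - T).^2)                           % 0.375
r2Value   = 1 - sum((Y - T).^2)/sum((T - mean(T)).^2)  % 0.9486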

Custom Metrics

If Deep Learning Toolbox™ does not provide the metric that you need for your task, then in many cases you can create a custom metric using a function. After you define the metric function, you can specify the metric as the Metrics name-value argument in the trainingOptions function. For more information, see Define Custom Metric Function.
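For example, this sketch defines a custom regression metric as a function with the two-input signature described in Define Custom Metric Function, where Y contains the predictions and T contains the targets. The function name and the specific metric are illustrative, not built in.

function metric = meanAbsoluteScaledError(Y,T)
% Hypothetical custom metric: mean absolute error scaled by the
% mean absolute target value. Returns a scalar.
metric = mean(abs(Y - T),"all") / mean(abs(T),"all");
end

You can then pass the function handle in the training options, for example trainingOptions("adam",Metrics=@meanAbsoluteScaledError).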

Early stopping and returning the best network are not supported for custom metric functions. If you require early stopping or returning the best network, then you must create a custom metric object instead. For more information, see Define Custom Deep Learning Metric Object.

See Also

trainnet | testnet | trainingOptions | dlnetwork | accuracyMetric | aucMetric | fScoreMetric | precisionMetric | recallMetric | rmseMetric | mapeMetric | rSquaredMetric
