objectDetectionMetrics - Object detection quality metrics - MATLAB
Object detection quality metrics
Since R2023b
Description
Use the objectDetectionMetrics object and its object functions to evaluate the quality of object detection results.

An objectDetectionMetrics object stores object detection quality metrics, such as the average precision (AP) and precision recall, computed per class and per image. To compute the AP and precision recall metrics, pass the objectDetectionMetrics object to the averagePrecision or the precisionRecall object functions, respectively. To compute the confusion matrix, pass the objectDetectionMetrics object to the confusionMatrix object function. To summarize all metrics across all classes and all images in the data set, use the summarize object function.
Properties
ClassMetrics — Metrics per class

This property is read-only.

Metrics per class, stored as a table with C rows, where C is the number of classes in the object detection. If additional metrics are not specified through the AdditionalMetrics argument of the evaluateObjectDetection function, the ClassMetrics table has five columns, corresponding to these object detection metrics:

- NumObjects — Number of objects in the ground truth data for a class.
- AP — Average precision (AP) for each class at each overlap threshold in [OverlapThreshold](objectdetectionmetrics.html#mw%5Feca15534-c978-4775-b162-db632406f5f1), stored as a numThresh-by-1 array, where numThresh is the number of overlap thresholds.
- APOverlapAvg — AP averaged over all overlap thresholds. Specify the overlap thresholds for a class using the threshold argument.
- Precision — Precision values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Precision is the ratio of the number of true positives (TP) to the total number of predicted positives:

  Precision = TP / (TP + FP)

  FP is the number of false positives. Larger precision scores imply that most detected objects match ground truth objects.
- Recall — Recall values, stored as a numThresh-by-(numPredictions+1) matrix, where numPredictions is the number of predicted bounding boxes. Recall is the ratio of the number of true positives (TP) to the total number of ground truths, which is the sum of true positives (TP) and false negatives (FN):

  Recall = TP / (TP + FN)

  FN is the number of false negatives. Larger recall scores indicate that more of the ground truth objects are detected.

Note: For each overlap threshold (row of the Recall matrix), the recall values (columns of the Recall matrix) are sorted in order of decreasing confidence score associated with each detection.

For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
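For example, you can read these columns directly from the table after evaluation. This is a minimal sketch; it assumes metrics is an objectDetectionMetrics object returned by evaluateObjectDetection.

% Minimal sketch (assumes "metrics" is returned by evaluateObjectDetection).
classMetrics = metrics.ClassMetrics;    % table with one row per class
disp(classMetrics)
counts = classMetrics.NumObjects;       % ground truth object count per class
apAvg = classMetrics.APOverlapAvg;      % AP averaged over all overlap thresholds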
ImageMetrics — Metrics per image

This property is read-only.

Metrics per image in the data set, stored as a table with numImages rows, where numImages is the number of images in the data set. If additional metrics are not specified through the AdditionalMetrics argument of the evaluateObjectDetection function, the ImageMetrics table has three columns, corresponding to these object detection metrics:

- NumObjects — Number of objects in the ground truth data in each image, stored as a positive integer.
- mAP — Mean average precision (mAP), calculated by averaging the average precision (AP) across all classes at each overlap threshold in the [OverlapThreshold](objectdetectionmetrics.html#mw%5Feca15534-c978-4775-b162-db632406f5f1) property, stored as a numThresh-by-1 numeric vector, where numThresh is the number of overlap thresholds. Specify the overlap thresholds for an image using the threshold argument.
- mAPOverlapAvg — Mean average precision (mAP), calculated by averaging the AP across all classes and all overlap thresholds specified by the [OverlapThreshold](objectdetectionmetrics.html#mw%5Feca15534-c978-4775-b162-db632406f5f1) property, stored as a numeric scalar.

For information on optional additional metrics for this table, see the AdditionalMetrics argument of the evaluateObjectDetection function.
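For example, you can use the per-image mAP to find the images where the detector performs worst. This is a minimal sketch; it assumes metrics comes from evaluateObjectDetection.

% Minimal sketch (assumes "metrics" is returned by evaluateObjectDetection).
imageMetrics = metrics.ImageMetrics;
[~,order] = sort(imageMetrics.mAPOverlapAvg);  % ascending: weakest images first
worstFive = order(1:min(5,numel(order)))       % indices of up to five weakest images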
ClassNames — Class names of detected objects

Class names of detected objects, stored as an array of strings or a cell array of character vectors.

Example: {"sky"} {"grass"} {"building"} {"sidewalk"}
OverlapThreshold — Overlap threshold

Overlap threshold, stored as a numeric scalar or numeric vector of box overlap threshold values over which the mean average precision is computed. When the intersection over union (IoU) of the pixels in the ground truth bounding box and the predicted bounding box is equal to or greater than the overlap threshold, the detection is considered a match to the ground truth (a true positive). The IoU is the number of pixels in the intersection of the bounding boxes divided by the number of pixels in the union of the bounding boxes.
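The following sketch illustrates the IoU test using the bboxOverlapRatio function, which computes the intersection over union of two boxes by default. The box coordinates and the 0.5 threshold are illustrative.

% Illustrative boxes in [x y width height] format.
gtBox   = [100 100 50 50];              % ground truth bounding box
predBox = [110 105 50 50];              % predicted bounding box
iou = bboxOverlapRatio(gtBox,predBox);  % intersection over union
isTruePositive = iou >= 0.5             % match at a 0.5 overlap threshold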
Object Functions
| Object Function | Description |
|---|---|
| averagePrecision | Evaluate average precision metric of object detection results |
| confusionMatrix | Compute confusion matrix of object detection results |
| precisionRecall | Get precision recall metrics of object detection results |
| summarize | Summarize object detection performance metrics at data set and class level |
| metricsByArea | Evaluate detection performance across object size ranges |
Examples
Load Test Data

Load a table containing images and ground truth bounding box labels. The first column contains the images, and the remaining columns contain the labeled bounding boxes.
data = load("vehicleTrainingData.mat");
trainingData = data.vehicleTrainingData;
Set the dataDir variable to the folder that contains the vehicleTrainingData.mat file, and add the full path to the image file names in the table.
dataDir = fullfile(toolboxdir("vision"),"visiondata");
trainingData.imageFilename = fullfile(dataDir,trainingData.imageFilename);
Create an imageDatastore using the files from the table.
imds = imageDatastore(trainingData.imageFilename);
Create a boxLabelDatastore using the label columns from the table.
blds = boxLabelDatastore(trainingData(:,2:end));
Load Pretrained Object Detector
Load a pretrained YOLO v2 object detector trained to detect vehicles into the workspace.
vehicleDetector = load("yolov2VehicleDetector.mat");
detector = vehicleDetector.detector;
Evaluate and Plot Object Detection Metrics
Run the detector on the test images. Set the detection threshold to a low value to detect as many objects as possible. This helps you evaluate the detector precision across the full range of recall values.
results = detect(detector,imds,Threshold=0.01);
Use evaluateObjectDetection to compute metrics for evaluating the performance of an object detector.
metrics = evaluateObjectDetection(results,blds);
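By default, evaluateObjectDetection evaluates detections at a single overlap threshold of 0.5. As a variation, you can pass one or more thresholds through the threshold argument mentioned in the Properties section; the values here are illustrative.

% Variation (illustrative thresholds): evaluate at several overlap thresholds.
metricsMulti = evaluateObjectDetection(results,blds,[0.5 0.75]);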
Return the precision and recall metrics for the vehicle class using the precisionRecall object function, and compute the average precision (AP) using the averagePrecision object function.

[recall,precision,scores] = precisionRecall(metrics);
ap = averagePrecision(metrics);
Plot the precision-recall curve for the vehicle class, the only class in the data set, and display the AP in the title.
figure
plot(recall{1},precision{1})
grid on
title("Average Precision = " + ap);
xlabel("Recall");
ylabel("Precision");
Compute the summary of the object detection metrics for the data set using the summarize object function.
[summaryDataset,summaryClass] = summarize(metrics);
summaryDataset

summaryDataset=1×3 table
    NumObjects    mAPOverlapAvg    mAP0.5
    __________    _____________    _______

       336           0.99096       0.99096
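As a follow-up, you can pass the same metrics object to the other object functions. For example, this hedged sketch computes the confusion matrix described in the Description section; see the confusionMatrix reference page for the exact outputs and options.

% Hedged sketch: confusion matrix from the same metrics object.
confMat = confusionMatrix(metrics);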
Version History
Introduced in R2023b
The ConfusionMatrix, NormalizedConfusionMatrix, and DatasetMetrics properties of the objectDetectionMetrics object have been removed.
To update your code to compute the confusion matrix, replace instances of the ConfusionMatrix and NormalizedConfusionMatrix properties with the confusionMatrix object function. For an example, see the "Evaluate Detector Errors Using Confusion Matrix" section of the Multiclass Object Detection Using YOLO v2 Deep Learning example.
To compute the summary of the object detection metrics over the entire data set or over each class, use the summarize object function.
To compute precision, recall, and confidence scores for all classes in the data set, or at specified classes and overlap thresholds, use the precisionRecall object function.
To compute average precision (AP) for all classes and overlap thresholds in the data set, or to specify the classes and overlap thresholds for which to compute AP, use the averagePrecision object function.
These table columns of the ClassMetrics and ImageMetrics properties have been renamed.

| objectDetectionMetrics Property | Renamed Columns |
|---|---|
| ClassMetrics | mAP, the average precision (AP) averaged over all overlap thresholds for each class, has been renamed to APOverlapAvg. mLAMR, the log-average miss rate for each class averaged over all specified overlap thresholds, has been renamed to LAMROverlapAvg. mAOS, the average orientation similarity for each class averaged over all specified overlap thresholds, has been renamed to AOSOverlapAvg. |
| ImageMetrics | AP, the AP across all classes at each overlap threshold, has been renamed to mAP. mAP, the AP averaged across all classes and all overlap thresholds, has been renamed to mAPOverlapAvg. mLAMR, the log-average miss rate for each class averaged over all specified overlap thresholds, has been renamed to LAMROverlapAvg. mAOS, the average orientation similarity for each class averaged over all specified overlap thresholds, has been renamed to AOSOverlapAvg. |