averagePrecision - Evaluate average precision metric of object detection results - MATLAB
Evaluate average precision metric of object detection results
Since R2024b
Syntax
Description
ap = averagePrecision(metrics)
evaluates the average precision (AP) for all classes and overlap thresholds of metrics.
AP aggregates the precision across different recall levels, providing a single metric to assess the overall ability of an object detector to identify objects accurately while minimizing false detections.
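To make the aggregation concrete, the following base-MATLAB sketch computes AP as the area under a precision-recall curve built from a hypothetical list of detections sorted by descending confidence score. The match results and ground-truth count are illustrative assumptions, not output of averagePrecision, which performs this computation internally from an objectDetectionMetrics object.

```matlab
% Hypothetical detections sorted by descending confidence score, each marked
% true positive (1) or false positive (0) at a given overlap threshold.
isTP = [1 1 0 1 0 1 0 0];   % assumed match results (illustrative)
numGroundTruth = 5;         % assumed number of labeled objects

tp = cumsum(isTP);                 % running true-positive count
fp = cumsum(~isTP);                % running false-positive count
precision = tp ./ (tp + fp);       % correctness of detections so far
recall = tp / numGroundTruth;      % completeness of detections so far

% AP is the area under the precision-recall curve, here approximated by
% trapezoidal integration with an initial point at recall 0, precision 1.
ap = trapz([0 recall],[1 precision]);
```

Lowering the detection threshold extends the curve to higher recall at the cost of lower precision, which is why the example below runs the detector with a low threshold before evaluation.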
ap = averagePrecision(metrics,Name=Value)
specifies options for the average precision evaluation using one or more name-value arguments. For example, ClassNames=["cars" "people"] specifies to evaluate the average precision metric for the cars and people classes.
Examples
Load a table containing image filenames and ground truth bounding box labels into the workspace. The first column contains the images, and the remaining columns contain the labeled bounding boxes.
data = load("vehicleTrainingData.mat"); trainingData = data.vehicleTrainingData;
Set the dataDir variable to the location of the vehicleTrainingData.mat file, and add the full path of the local vehicle data folder to the image filenames in the table.
dataDir = fullfile(toolboxdir("vision"),"visiondata"); trainingData.imageFilename = fullfile(dataDir,trainingData.imageFilename);
Create an imageDatastore using the files from the table.
imds = imageDatastore(trainingData.imageFilename);
Create a boxLabelDatastore using the label columns from the table.
blds = boxLabelDatastore(trainingData(:,2:end));
Load a pretrained YOLO v2 object detector, trained to detect vehicles, into the workspace.
vehicleDetector = load("yolov2VehicleDetector.mat"); detector = vehicleDetector.detector;
Compute Average Precision Metric
Run the detector on the test images. Set the detection threshold to a low value to detect as many objects as possible. This enables you to evaluate the detector precision across the full range of recall values.
results = detect(detector,imds,Threshold=0.01);
Compute the metrics for evaluating the performance of the object detector.
metrics = evaluateObjectDetection(results,blds);
Evaluate the average precision for all classes. The high AP across all classes indicates that the object detection model demonstrates near-perfect precision in identifying objects correctly, while also achieving comprehensive recall, missing very few actual objects across all evaluated classes.
ap = averagePrecision(metrics)
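As a follow-on, you can restrict the evaluation to specific classes and overlap thresholds with name-value arguments. This sketch assumes the metrics object computed above; the "vehicle" class name is an assumption based on the vehicle detector used in this example.

```matlab
% Assumes metrics from evaluateObjectDetection above; "vehicle" is the
% assumed class name for this single-class detector.
apStrict = averagePrecision(metrics,ClassNames="vehicle", ...
    OverlapThreshold=[0.5 0.75]);
% apStrict is a 1-by-2 matrix: one row per requested class, one column
% per requested overlap threshold.
```

Raising the overlap threshold tightens the localization requirement, so AP at 0.75 is typically lower than AP at 0.5.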
Input Arguments
Name-Value Arguments
Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.
Example: averagePrecision(metrics,ClassNames=["cars" "people"]) specifies to evaluate the average precision metric for the cars and people classes.
Class names of detected objects, specified as an array of strings or a cell array of character vectors. By default, the averagePrecision function returns the average precision metrics for all classes specified by the ClassNames property of the objectDetectionMetrics object metrics.
Overlap threshold to use for evaluating the average precision, specified as a numeric scalar or numeric vector of box overlap threshold values. To evaluate multiple overlap thresholds, specify this argument as a numeric vector. By default, the averagePrecision object function returns the average precision metrics for all overlap thresholds specified by the OverlapThreshold property of the objectDetectionMetrics object metrics.
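For example, a sweep over several overlap thresholds can be expressed as a single call. This sketch assumes metrics is an objectDetectionMetrics object returned by evaluateObjectDetection.

```matlab
% Assumes metrics is an objectDetectionMetrics object. Evaluate AP at
% three increasingly strict box overlap thresholds in one call.
thresholds = [0.5 0.75 0.9];
ap = averagePrecision(metrics,OverlapThreshold=thresholds);
% Each column of ap corresponds to one threshold; each row to one class
% in the ClassNames property of metrics.
```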
Output Arguments
Average precision (AP) for the specified classes and overlap thresholds, returned as an M-by-N matrix. M is the number of classes in the ClassNames property, and N is the number of overlap thresholds specified by OverlapThreshold.
The AP metric evaluates object detection performance by quantifying the accuracy of the model in identifying objects across different confidence thresholds, enabling you to assess both the precision (correctness of detections) and recall (completeness of detections).
Version History
Introduced in R2024b