kfoldMargin - Classification margins for cross-validated kernel ECOC model - MATLAB

Classification margins for cross-validated kernel ECOC model

Syntax

margin = kfoldMargin(CVMdl)
margin = kfoldMargin(CVMdl,Name,Value)

Description

margin = kfoldMargin(CVMdl) returns the classification margins obtained by the cross-validated kernel ECOC model CVMdl. For every fold, kfoldMargin computes the classification margins for validation-fold observations using a model trained on training-fold observations.

margin = kfoldMargin(CVMdl,Name,Value) returns classification margins with additional options specified by one or more name-value pair arguments. For example, specify the binary learner loss function, decoding scheme, or verbosity level.

Examples


Load Fisher's iris data set. X contains flower measurements, and Y contains the names of flower species.

load fisheriris
X = meas;
Y = species;

Cross-validate an ECOC model composed of kernel binary learners.

CVMdl = fitcecoc(X,Y,'Learners','kernel','CrossVal','on')

CVMdl = 
  ClassificationPartitionedKernelECOC
    CrossValidatedModel: 'KernelECOC'
           ResponseName: 'Y'
        NumObservations: 150
                  KFold: 10
              Partition: [1×1 cvpartition]
             ClassNames: {'setosa'  'versicolor'  'virginica'}
         ScoreTransform: 'none'


CVMdl is a ClassificationPartitionedKernelECOC model. By default, the software implements 10-fold cross-validation. To specify a different number of folds, use the 'KFold' name-value pair argument instead of 'CrossVal'.

Estimate the classification margins for validation-fold observations.

m = kfoldMargin(CVMdl);
size(m)

m is a 150-by-1 vector. m(j) is the classification margin for observation j.

Plot the k-fold margins using a boxplot.

boxplot(m,'Labels','All Observations')
title('Distribution of Margins')

Figure: box plot titled "Distribution of Margins" showing the k-fold margins for all observations.

Perform feature selection by comparing _k_-fold margins from multiple models. Based solely on this criterion, the classifier with the greatest margins is the best classifier.

Load Fisher's iris data set. X contains flower measurements, and Y contains the names of flower species.

load fisheriris
X = meas;
Y = species;

Randomly choose half of the predictor variables.

rng(1); % For reproducibility
p = size(X,2); % Number of predictors
idxPart = randsample(p,ceil(0.5*p));

Cross-validate two ECOC models composed of kernel classification models: one that uses all of the predictors, and one that uses half of the predictors.

CVMdl = fitcecoc(X,Y,'Learners','kernel','CrossVal','on');
PCVMdl = fitcecoc(X(:,idxPart),Y,'Learners','kernel','CrossVal','on');

CVMdl and PCVMdl are ClassificationPartitionedKernelECOC models. By default, the software implements 10-fold cross-validation. To specify a different number of folds, use the 'KFold' name-value pair argument instead of 'CrossVal'.

Estimate the _k_-fold margins for each classifier.

fullMargins = kfoldMargin(CVMdl);
partMargins = kfoldMargin(PCVMdl);

Plot the distribution of the margin sets using box plots.

boxplot([fullMargins partMargins], ...
    'Labels',{'All Predictors','Half of the Predictors'});
title('Distribution of Margins')

Figure: box plots titled "Distribution of Margins" comparing the margins of the model trained on all predictors with the model trained on half of the predictors.

The PCVMdl margin distribution is similar to the CVMdl margin distribution.

Input Arguments

CVMdl: Cross-validated kernel ECOC model, specified as a ClassificationPartitionedKernelECOC model. You can create a ClassificationPartitionedKernelECOC model by training an ECOC model with kernel binary learners using fitcecoc and specifying one of the cross-validation name-value pair arguments, for example, 'CrossVal'.

Name-Value Arguments


Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: kfoldMargin(CVMdl,'Verbose',1) specifies to display diagnostic messages in the Command Window.

Binary learner loss function, specified as the comma-separated pair consisting of 'BinaryLoss' and a built-in loss function name or a function handle. For the built-in functions and their formulas, see Binary Loss.

Example: 'BinaryLoss','hinge'

Data Types: char | string | function_handle

Decoding scheme that aggregates the binary losses, specified as the comma-separated pair consisting of 'Decoding' and'lossweighted' or 'lossbased'. For more information, see Binary Loss.

Example: 'Decoding','lossbased'

Data Types: char | string

Verbosity level, specified as the comma-separated pair consisting of 'Verbose' and 0 or 1. 'Verbose' controls the number of diagnostic messages that the software displays in the Command Window.

Example: 'Verbose',1

Data Types: single | double

Output Arguments


Classification margins, returned as a numeric vector. margin is an n-by-1 vector, where each row is the margin of the corresponding observation and n is the number of observations (size(CVMdl.Y,1)).

More About


Classification Margin

The classification margin is, for each observation, the difference between the negative loss for the true class and the maximal negative loss among the false classes. If the margins are on the same scale, then they serve as a classification confidence measure. Among multiple classifiers, those that yield greater margins are better.
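This definition can be illustrated with a small sketch (Python rather than MATLAB, and not toolbox code; the negated-loss values below are made up for the example). Given the per-class negated losses that kfoldPredict would return in NegLoss, the margin is the true class's entry minus the largest entry among the other classes:

```python
import numpy as np

# Hypothetical NegLoss matrix: rows are observations, columns are classes.
# The values are invented purely for illustration.
neg_loss = np.array([
    [ 0.9, -0.4, -1.2],   # true class 0
    [-0.3,  0.7, -0.8],   # true class 1
    [-1.0, -0.2,  0.1],   # true class 2
])
true_class = np.array([0, 1, 2])

def classification_margin(neg_loss, true_class):
    """Margin = negated loss for the true class minus the maximal
    negated loss among the remaining (false) classes."""
    n = neg_loss.shape[0]
    true_scores = neg_loss[np.arange(n), true_class]
    masked = neg_loss.copy()
    masked[np.arange(n), true_class] = -np.inf  # exclude the true class
    return true_scores - masked.max(axis=1)

margins = classification_margin(neg_loss, true_class)
```

All three margins here are positive, meaning each observation scores higher for its true class than for any false class.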

Binary Loss

The binary loss is a function of the class and classification score that determines how well a binary learner classifies an observation into the class. The decoding scheme of an ECOC model specifies how the software aggregates the binary losses and determines the predicted class for each observation.

Assume the following:

- mkb is element (k,b) of the coding design matrix M (that is, the code corresponding to class k of binary learner b), with mkb in the set {–1,0,1}.
- sb is the score of binary learner b for an observation.
- g is the binary loss function.
- k̂ is the predicted class for the observation.

The software supports two decoding schemes:

- Loss-based decoding [2] aggregates the binary losses averaged over all B binary learners:

  k̂ = argmin_k (1/B) Σ_{b=1..B} |mkb| g(mkb,sb)

- Loss-weighted decoding [3] aggregates the binary losses averaged over the binary learners assigned to the corresponding class:

  k̂ = argmin_k [ Σ_{b=1..B} |mkb| g(mkb,sb) ] / [ Σ_{b=1..B} |mkb| ]

The predict, resubPredict, and kfoldPredict functions return the negated value of the objective function of argmin as the second output argument (NegLoss) for each observation and class.
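The two decoding schemes differ only in how the per-class aggregate is normalized. A hedged sketch (Python, not toolbox code; the coding matrix M, the scores s, and the decode helper are all made up for this example, with hinge as the binary loss):

```python
import numpy as np

def hinge_loss(y, s):
    """Binary hinge loss from the table: g(y, s) = max(0, 1 - y*s)/2."""
    return np.maximum(0, 1 - y * s) / 2

# Hypothetical one-vs-one coding design for 3 classes and 3 binary learners
# (rows: classes, columns: learners; entries in {-1, 0, 1}).
M = np.array([
    [ 1,  1,  0],
    [-1,  0,  1],
    [ 0, -1, -1],
])
s = np.array([0.8, -0.2, 0.5])  # made-up binary learner scores

def decode(M, s, loss, scheme="lossweighted"):
    """Aggregate binary losses per class; return (predicted class, NegLoss analogue)."""
    absM = np.abs(M)
    # loss(M, s) broadcasts to g(m_kb, s_b) for every class k and learner b
    agg = (absM * loss(M, s)).sum(axis=1)
    if scheme == "lossweighted":
        agg = agg / absM.sum(axis=1)   # divide by learners assigned to each class
    else:
        agg = agg / M.shape[1]          # loss-based: average over all B learners
    return int(np.argmin(agg)), -agg

k_hat, neg_loss = decode(M, s, hinge_loss)
```

The predicted class is the argmin of the aggregate, and negating the aggregate gives the per-class values that correspond to NegLoss.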

This table summarizes the supported binary loss functions, where yj is a class label for a particular binary learner (in the set {–1,1,0}), sj is the score for observation j, and g(yj,sj) is the binary loss function.

Value | Description | Score Domain | g(yj,sj)
"binodeviance" | Binomial deviance | (–∞,∞) | log[1 + exp(–2yjsj)]/[2log(2)]
"exponential" | Exponential | (–∞,∞) | exp(–yjsj)/2
"hamming" | Hamming | [0,1] or (–∞,∞) | [1 – sign(yjsj)]/2
"hinge" | Hinge | (–∞,∞) | max(0,1 – yjsj)/2
"linear" | Linear | (–∞,∞) | (1 – yjsj)/2
"logit" | Logistic | (–∞,∞) | log[1 + exp(–yjsj)]/[2log(2)]
"quadratic" | Quadratic | [0,1] | [1 – yj(2sj – 1)]^2/2

The software normalizes binary losses so that the loss is 0.5 when yj = 0, and aggregates using the average of the binary learners [1].
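The table's formulas can be transcribed directly; as a quick consistency check of the normalization (a Python sketch, not part of the toolbox), every built-in loss evaluates to 0.5 when yj = 0, regardless of the score:

```python
import numpy as np

# Binary loss functions g(y, s) from the table above.
binary_losses = {
    "binodeviance": lambda y, s: np.log1p(np.exp(-2 * y * s)) / (2 * np.log(2)),
    "exponential":  lambda y, s: np.exp(-y * s) / 2,
    "hamming":      lambda y, s: (1 - np.sign(y * s)) / 2,
    "hinge":        lambda y, s: np.maximum(0, 1 - y * s) / 2,
    "linear":       lambda y, s: (1 - y * s) / 2,
    "logit":        lambda y, s: np.log1p(np.exp(-y * s)) / (2 * np.log(2)),
    "quadratic":    lambda y, s: (1 - y * (2 * s - 1)) ** 2 / 2,
}

# Normalization check: with y = 0 every loss equals 0.5 (the score 0.7 is arbitrary).
at_zero = {name: float(g(0, 0.7)) for name, g in binary_losses.items()}
```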

Do not confuse the binary loss with the overall classification loss (specified by the LossFun name-value argument of the kfoldLoss and kfoldPredict object functions), which measures how well an ECOC classifier performs as a whole.

References

[1] Allwein, E., R. Schapire, and Y. Singer. “Reducing multiclass to binary: A unifying approach for margin classifiers.” Journal of Machine Learning Research. Vol. 1, 2000, pp. 113–141.

[2] Escalera, S., O. Pujol, and P. Radeva. “Separability of ternary codes for sparse designs of error-correcting output codes.”Pattern Recog. Lett. Vol. 30, Issue 3, 2009, pp. 285–297.

[3] Escalera, S., O. Pujol, and P. Radeva. “On the decoding process in ternary error-correcting output codes.” IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. 32, Issue 7, 2010, pp. 120–134.

Extended Capabilities


To run in parallel, specify the Options name-value argument in the call to this function and set the UseParallel field of the options structure to true using statset:

Options=statset(UseParallel=true)

For more information about parallel computing, see Run MATLAB Functions with Automatic Parallel Support (Parallel Computing Toolbox).

Version History

Introduced in R2018b


Starting in R2023b, the following classification model object functions use observations with missing predictor values as part of resubstitution ("resub") and cross-validation ("kfold") computations for classification edges, losses, margins, and predictions.

Model Type | Model Objects | Object Functions
Discriminant analysis classification model | ClassificationDiscriminant | resubEdge, resubLoss, resubMargin, resubPredict
  | ClassificationPartitionedModel | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
Ensemble of discriminant analysis learners for classification | ClassificationEnsemble | resubEdge, resubLoss, resubMargin, resubPredict
  | ClassificationPartitionedEnsemble | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
Gaussian kernel classification model | ClassificationPartitionedKernel | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
  | ClassificationPartitionedKernelECOC | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
Linear classification model | ClassificationPartitionedLinear | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
  | ClassificationPartitionedLinearECOC | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
Neural network classification model | ClassificationNeuralNetwork | resubEdge, resubLoss, resubMargin, resubPredict
  | ClassificationPartitionedModel | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict
Support vector machine (SVM) classification model | ClassificationSVM | resubEdge, resubLoss, resubMargin, resubPredict
  | ClassificationPartitionedModel | kfoldEdge, kfoldLoss, kfoldMargin, kfoldPredict

In previous releases, the software omitted observations with missing predictor values from the resubstitution and cross-validation computations.