MultiLabelMarginLoss — PyTorch 2.7 documentation

class torch.nn.MultiLabelMarginLoss(size_average=None, reduce=None, reduction='mean')[source]

Creates a criterion that optimizes a multi-class multi-classification hinge loss (margin-based loss) between input x (a 2D mini-batch Tensor) and output y (a 2D Tensor of target class indices). For each sample in the mini-batch:

$$\text{loss}(x, y) = \sum_{ij} \frac{\max(0, 1 - (x[y[j]] - x[i]))}{\text{x.size}(0)}$$

where $x \in \left\{0, \; \cdots, \; \text{x.size}(0) - 1\right\}$, $y \in \left\{0, \; \cdots, \; \text{y.size}(0) - 1\right\}$, $0 \leq y[j] \leq \text{x.size}(0) - 1$, and $i \neq y[j]$ for all $i$ and $j$.

y and x must have the same size.

The criterion only considers a contiguous block of non-negative targets that starts at the front.

This allows different samples to have a variable number of target classes, as sketched below.
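For instance, the following sketch (score and target values chosen arbitrarily for illustration) encodes two target classes for the first sample and three for the second by padding the target rows with -1; entries after the first -1 in each row are ignored:

import torch
import torch.nn as nn

criterion = nn.MultiLabelMarginLoss()
scores = torch.randn(2, 5)                    # 2 samples, 5 classes
targets = torch.tensor([[1, 3, -1, -1, -1],   # sample 0: target classes {1, 3}
                        [0, 2, 4, -1, -1]])   # sample 1: target classes {0, 2, 4}
print(criterion(scores, targets))             # scalar loss averaged over the batch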

Parameters

* size_average (bool, optional) – Deprecated (see reduction). By default, the losses are averaged over each loss element in the batch. If size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True
* reduce (bool, optional) – Deprecated (see reduction). By default, the losses are averaged or summed over observations for each minibatch depending on size_average. When reduce is False, returns a loss per batch element instead and ignores size_average. Default: True
* reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated; specifying either of them will override reduction. Default: 'mean'

Shape:

* Input: (C) or (N, C), where N is the batch size and C is the number of classes.
* Target: (C) or (N, C), label targets padded by -1 ensuring the same shape as the input.
* Output: scalar. If reduction is 'none', then (N).
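As a brief illustration of these shapes (hypothetical values): with a batched (N, C) input and reduction='none', the criterion returns one loss per sample rather than a scalar:

import torch
import torch.nn as nn

criterion = nn.MultiLabelMarginLoss(reduction='none')
scores = torch.randn(3, 4)                    # N=3 samples, C=4 classes
targets = torch.tensor([[0, -1, -1, -1],
                        [1, 2, -1, -1],
                        [3, 0, 2, -1]])
per_sample = criterion(scores, targets)
print(per_sample.shape)                       # torch.Size([3]) -- one loss per sample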

Examples:

>>> import torch
>>> from torch import nn
>>> loss = nn.MultiLabelMarginLoss()
>>> x = torch.FloatTensor([[0.1, 0.2, 0.4, 0.8]])
>>> # for target y, only consider labels 3 and 0, not after label -1
>>> y = torch.LongTensor([[3, 0, -1, 1]])
>>> # 0.25 * ((1-(0.1-0.2)) + (1-(0.1-0.4)) + (1-(0.8-0.2)) + (1-(0.8-0.4)))
>>> loss(x, y)
tensor(0.85...)
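To connect the printed value back to the formula above, here is a minimal re-derivation in plain Python. This is an illustrative sketch of the math, not the library's actual kernel; the helper name manual_multilabel_margin is hypothetical.

import torch

x = torch.tensor([[0.1, 0.2, 0.4, 0.8]])
y = torch.tensor([[3, 0, -1, 1]])

def manual_multilabel_margin(scores, targets):
    # Re-derives the formula for each sample, then averages over the batch ('mean' reduction).
    batch_losses = []
    for xi, yi in zip(scores, targets):
        C = xi.size(0)
        # keep only the contiguous block of non-negative targets at the front
        kept = []
        for t in yi.tolist():
            if t < 0:
                break
            kept.append(t)
        sample_loss = 0.0
        for j in kept:                      # each target class y[j]
            for i in range(C):              # each non-target class i
                if i in kept:
                    continue
                sample_loss += max(0.0, 1.0 - (xi[j] - xi[i]).item())
        batch_losses.append(sample_loss / C)
    return sum(batch_losses) / len(batch_losses)

print(manual_multilabel_margin(x, y))   # ~0.85, matching tensor(0.85...) above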