MultiLabelSoftMarginLoss — PyTorch 2.7 documentation

class torch.nn.MultiLabelSoftMarginLoss(weight=None, size_average=None, reduce=None, reduction='mean')[source]

Creates a criterion that optimizes a multi-label one-versus-all loss based on max-entropy, between input x and target y of size (N, C). For each sample in the minibatch:

loss(x, y) = -\frac{1}{C} \sum_i y[i] \cdot \log\left((1 + \exp(-x[i]))^{-1}\right) + (1 - y[i]) \cdot \log\left(\frac{\exp(-x[i])}{1 + \exp(-x[i])}\right)

where i ∈ {0, ⋯, x.nElement() − 1} and y[i] ∈ {0, 1}.
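Since (1 + exp(−x))⁻¹ is the sigmoid function, the formula above is the per-class binary cross-entropy on the logits, averaged over the C classes. A minimal sketch verifying this equivalence numerically (the tensor shapes and seed here are illustrative, not from the original doc):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(3, 5)                        # logits of shape (N, C) = (3, 5)
y = torch.randint(0, 2, (3, 5)).float()      # multi-hot targets, y[i] in {0, 1}

loss_fn = nn.MultiLabelSoftMarginLoss()      # default reduction='mean'
loss = loss_fn(x, y)

# Manual evaluation of the formula: sigmoid(x) = (1 + exp(-x))^{-1},
# and exp(-x) / (1 + exp(-x)) = 1 - sigmoid(x).
sig = torch.sigmoid(x)
per_sample = -(y * sig.log() + (1 - y) * (1 - sig).log()).mean(dim=1)

# With reduction='mean', per-sample losses are averaged over the batch.
assert torch.allclose(loss, per_sample.mean())
```

The assertion passing shows the criterion is equivalent to mean binary cross-entropy over classes, which is why it is often interchangeable with BCEWithLogitsLoss up to the 1/C scaling.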

Parameters

    weight (Tensor, optional) – a manual rescaling weight given to each class. If given, it has to be a Tensor of size C. Otherwise, it is treated as if having all ones.

    size_average (bool, optional) – Deprecated (see reduction).

    reduce (bool, optional) – Deprecated (see reduction).

    reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Default: 'mean'

Shape:

    Input: (N, C) where N is the batch size and C is the number of classes.

    Target: (N, C), label targets must have the same shape as the input.

    Output: scalar. If reduction is 'none', then (N).