SoftMarginLoss — PyTorch 2.7 documentation

class torch.nn.SoftMarginLoss(size_average=None, reduce=None, reduction='mean')[source]

Creates a criterion that optimizes a two-class classification logistic loss between input tensor x and target tensor y (containing 1 or -1).

$$\text{loss}(x, y) = \sum_i \frac{\log(1 + \exp(-y[i] \cdot x[i]))}{x.\text{nelement}()}$$

Parameters

* size_average (bool, optional) – Deprecated (see reduction).
* reduce (bool, optional) – Deprecated (see reduction).
* reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Default: 'mean'

Shape:

* Input: (*), where * means any number of dimensions.
* Target: (*), same shape as the input.
* Output: scalar. If reduction is 'none', then (*), same shape as the input.
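
A minimal usage sketch (the tensor shapes and values below are illustrative, not from the documentation): it computes the mean-reduced loss on random logits and ±1 targets, then re-derives the same value directly from the formula above.

```python
import torch
import torch.nn as nn

# Illustrative shapes/values; any shape works as long as input and target match.
x = torch.randn(4, 8, requires_grad=True)         # raw scores (logits)
y = torch.randint(0, 2, (4, 8)).float() * 2 - 1   # targets in {-1, +1}

loss_fn = nn.SoftMarginLoss()   # reduction='mean' by default
loss = loss_fn(x, y)
loss.backward()                 # gradients flow back to x

# Re-derive the mean-reduced loss from the formula above.
manual = torch.log1p(torch.exp(-y * x.detach())).mean()
print(loss.item(), manual.item())  # the two values should agree
```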