torch.nn.functional.binary_cross_entropy — PyTorch 2.7 documentation
torch.nn.functional.binary_cross_entropy(input, target, weight=None, size_average=None, reduce=None, reduction='mean')
Measure Binary Cross Entropy between the target and input probabilities.
See BCELoss for details.
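The per-element loss follows the formula documented for BCELoss: -[y * log(x) + (1 - y) * log(1 - x)], averaged under the default reduction. A minimal sketch verifying the functional call against that formula computed by hand (the tensor values are arbitrary, chosen only for illustration):

    import torch
    import torch.nn.functional as F

    probs = torch.tensor([[0.8, 0.2], [0.4, 0.9]])   # probabilities in (0, 1)
    target = torch.tensor([[1.0, 0.0], [0.0, 1.0]])

    # Per-element BCE: -[y*log(x) + (1-y)*log(1-x)], then averaged
    # because reduction defaults to 'mean'.
    manual = -(target * probs.log() + (1 - target) * (1 - probs).log()).mean()
    assert torch.allclose(manual, F.binary_cross_entropy(probs, target))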
Parameters
- input (Tensor) – Tensor of arbitrary shape as probabilities.
- target (Tensor) – Tensor of the same shape as input with values between 0 and 1.
- weight (Tensor, optional) – a manual rescaling weight; if provided, it is repeated to match the input tensor shape (demonstrated in the sketch after this list)
- size_average (bool, optional) – Deprecated (see reduction). By default, the losses are averaged over each loss element in the batch. Note that for some losses, there are multiple elements per sample. If the field size_average is set to False, the losses are instead summed for each minibatch. Ignored when reduce is False. Default: True
- reduce (bool, optional) – Deprecated (see reduction). By default, the losses are averaged or summed over observations for each minibatch depending on size_average. When reduce is False, returns a loss per batch element instead and ignores size_average. Default: True
- reduction (str, optional) – Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied, 'mean': the sum of the output will be divided by the number of elements in the output, 'sum': the output will be summed. Note: size_average and reduce are in the process of being deprecated, and in the meantime, specifying either of those two args will override reduction. Default: 'mean'. The sketch after this list contrasts the three modes.
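A minimal sketch, with illustrative values, of how reduction and weight behave; the (2,)-shaped weight is repeated across the batch dimension to match the input shape, as described above:

    import torch
    import torch.nn.functional as F

    probs = torch.tensor([[0.9, 0.1], [0.3, 0.7]])
    target = torch.tensor([[1.0, 0.0], [1.0, 1.0]])

    # 'none' keeps per-element losses; 'mean' and 'sum' reduce them.
    per_elem = F.binary_cross_entropy(probs, target, reduction='none')
    assert torch.allclose(per_elem.mean(),
                          F.binary_cross_entropy(probs, target, reduction='mean'))
    assert torch.allclose(per_elem.sum(),
                          F.binary_cross_entropy(probs, target, reduction='sum'))

    # weight rescales each element's loss before any reduction.
    w = torch.tensor([0.25, 0.75])
    weighted = F.binary_cross_entropy(probs, target, weight=w, reduction='none')
    assert torch.allclose(weighted, per_elem * w)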
Return type
Tensor
Examples:
    import torch
    import torch.nn.functional as F

    input = torch.randn(3, 2, requires_grad=True)
    target = torch.rand(3, 2, requires_grad=False)
    loss = F.binary_cross_entropy(torch.sigmoid(input), target)
    loss.backward()
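Because binary_cross_entropy expects probabilities, the example above passes the raw values through torch.sigmoid first. When starting from logits, torch.nn.functional.binary_cross_entropy_with_logits computes the same quantity in a single, more numerically stable call; a short sketch of the equivalence (up to floating-point error):

    import torch
    import torch.nn.functional as F

    logits = torch.randn(3, 2, requires_grad=True)
    target = torch.rand(3, 2)

    two_step = F.binary_cross_entropy(torch.sigmoid(logits), target)
    fused = F.binary_cross_entropy_with_logits(logits, target)
    assert torch.allclose(two_step, fused, atol=1e-6)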