torch.nn.functional.softmax — PyTorch 2.7 documentation

torch.nn.functional.softmax(input, dim=None, _stacklevel=3, dtype=None)[source]

Apply a softmax function.

Softmax is defined as:

\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

It is applied to all slices along dim, and will re-scale them so that the elements lie in the range [0, 1] and sum to 1.

See Softmax for more details.
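A minimal usage sketch (the tensor shape and dim choice here are illustrative): applying softmax along dim=1 normalizes each row so it sums to 1.

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 3)
out = F.softmax(x, dim=1)   # normalize each slice along dim=1

print(out)                  # all elements lie in [0, 1]
print(out.sum(dim=1))       # each row sums to 1 (up to float rounding)
```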

Parameters

* **input** (Tensor) – input tensor.
* **dim** (int) – a dimension along which softmax will be computed.
* **dtype** (torch.dtype, optional) – the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

Return type

Tensor
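One sketch of how the dtype argument is typically used (the half-precision input here is an assumption for illustration): casting to float32 before the exp/sum avoids overflow and precision loss in low-precision inputs.

```python
import torch
import torch.nn.functional as F

# Half-precision activations; compute softmax in float32 for stability.
x = torch.randn(4, 8, dtype=torch.float16)
out = F.softmax(x, dim=-1, dtype=torch.float32)

print(out.dtype)  # torch.float32
```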

Note

This function doesn’t work directly with NLLLoss, which expects log-probabilities as input rather than raw softmax outputs. Use log_softmax instead (it’s faster and has better numerical properties).
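A short sketch of the recommended pairing (batch size and class count are arbitrary): nll_loss consumes the output of log_softmax, and the combined cross_entropy gives the same result in one step.

```python
import torch
import torch.nn.functional as F

logits = torch.randn(3, 5)          # batch of 3 samples, 5 classes
target = torch.tensor([0, 2, 4])    # class index per sample

# Correct: NLLLoss expects log-probabilities, so pair it with log_softmax.
loss = F.nll_loss(F.log_softmax(logits, dim=1), target)

# Equivalent one-step alternative that fuses both operations.
loss_ce = F.cross_entropy(logits, target)
assert torch.allclose(loss, loss_ce)
```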