CELU — PyTorch 2.7 documentation
class torch.nn.CELU(alpha=1.0, inplace=False)
Applies the CELU function element-wise.
\text{CELU}(x) = \max(0, x) + \min(0, \alpha * (\exp(x/\alpha) - 1))
More details can be found in the paper Continuously Differentiable Exponential Linear Units.
Parameters
- alpha (float) – the α\alpha value for the CELU formulation. Default: 1.0
- inplace (bool) – can optionally do the operation in-place. Default: False
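As a sketch of what the inplace flag implies: with inplace=True the activation overwrites its input tensor rather than allocating a new one, which saves memory but destroys the original values. The tensor values below are illustrative, not from the original documentation.

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -1.0, 0.0, 1.0])

# inplace=False (the default) leaves the input untouched.
out = nn.CELU()(x)
assert torch.equal(x, torch.tensor([-2.0, -1.0, 0.0, 1.0]))

# inplace=True writes the result directly into x.
out_inplace = nn.CELU(inplace=True)(x)
assert torch.equal(x, out_inplace)  # x now holds the activated values
```

Note that in-place activations cannot be used when the input tensor is needed elsewhere, e.g. for a residual connection or gradient computation through the original values.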
Shape:
- Input: (*), where * means any number of dimensions.
- Output: (*), same shape as the input.
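A minimal sketch checking the formula above numerically: the module's output should match max(0, x) + min(0, alpha * (exp(x/alpha) - 1)) computed by hand. The alpha value and input range here are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn

alpha = 0.5
x = torch.linspace(-3.0, 3.0, steps=7)

celu = nn.CELU(alpha=alpha)

# max(0, x) via clamp(min=0); min(0, ...) via clamp(max=0)
expected = torch.clamp(x, min=0) + torch.clamp(alpha * (torch.exp(x / alpha) - 1), max=0)

assert torch.allclose(celu(x), expected)
```

For positive inputs CELU is the identity; for negative inputs it saturates smoothly toward -alpha, and the function is continuously differentiable at 0 for any alpha.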
Examples:
import torch
import torch.nn as nn

m = nn.CELU()
input = torch.randn(2)
output = m(input)