CELU — PyTorch 2.7 documentation

class torch.nn.CELU(alpha=1.0, inplace=False)[source]

Applies the CELU function element-wise.

CELU(x) = max(0, x) + min(0, α * (exp(x / α) − 1))

More details can be found in the paper Continuously Differentiable Exponential Linear Units.
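The formula above can be checked numerically against PyTorch's built-in `torch.celu`. This is a minimal sketch; `celu_manual` is a hypothetical helper written here only to mirror the definition term by term:

```python
import torch

def celu_manual(x, alpha=1.0):
    # CELU(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    zero = torch.zeros_like(x)
    return torch.maximum(x, zero) + torch.minimum(zero, alpha * (torch.exp(x / alpha) - 1))

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(torch.allclose(celu_manual(x), torch.celu(x)))  # → True
```

For negative inputs the output follows the scaled exponential branch and approaches −α as x → −∞; for non-negative inputs CELU is the identity.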

Parameters

* alpha (float) – the α value for the CELU formulation. Default: 1.0
* inplace (bool) – can optionally do the operation in-place. Default: False

Shape:

* Input: (∗), where ∗ means any number of dimensions.
* Output: (∗), same shape as the input.

[Figure: plot of the CELU activation function.]

Examples:

import torch
from torch import nn

m = nn.CELU()
input = torch.randn(2)
output = m(input)
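The `alpha` parameter sets the saturation level on the negative side. A small sketch illustrating this with `alpha=2.0` (the specific input values are chosen here for illustration):

```python
import torch
from torch import nn

m = nn.CELU(alpha=2.0)
x = torch.tensor([-10.0, 0.0, 3.0])
out = m(x)
# Strongly negative inputs saturate toward -alpha = -2;
# non-negative inputs pass through unchanged.
print(out)
```

Setting `inplace=True` instead writes the result directly into the input tensor, which saves memory but destroys the original values.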