ELU — PyTorch 2.7 documentation

class torch.nn.ELU(alpha=1.0, inplace=False)[source]

Applies the Exponential Linear Unit (ELU) function, element-wise.

Method described in the paper: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs).

ELU is defined as:

\text{ELU}(x) = \begin{cases} x, & \text{if } x > 0 \\ \alpha * (\exp(x) - 1), & \text{if } x \leq 0 \end{cases}
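The piecewise definition can be checked directly against the module. The sketch below (with an arbitrarily chosen input range and the default alpha of 1.0) compares a manual torch.where implementation of the formula with the output of nn.ELU.

import torch
import torch.nn as nn

x = torch.linspace(-3.0, 3.0, steps=7)
alpha = 1.0

# Piecewise definition: x for x > 0, alpha * (exp(x) - 1) for x <= 0
manual = torch.where(x > 0, x, alpha * (torch.exp(x) - 1))

# The module applies the same function element-wise
module_out = nn.ELU(alpha=alpha)(x)
print(torch.allclose(manual, module_out))  # True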

Parameters

    alpha (float) – the α value for the ELU formulation. Default: 1.0
    inplace (bool) – can optionally do the operation in-place. Default: False

Shape:

    Input: (*), where * means any number of dimensions.
    Output: (*), same shape as the input.

[Figure: plot of the ELU activation function]

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.ELU()
>>> input = torch.randn(2)
>>> output = m(input)
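The same operation is also available in functional form as torch.nn.functional.elu, which applies ELU without constructing a module. The sketch below (with arbitrary input values) compares the two forms and illustrates the in-place variant enabled by inplace=True.

import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4)

# Module and functional forms compute the same result
out_module = nn.ELU(alpha=1.0)(x)
out_functional = F.elu(x, alpha=1.0)
print(torch.allclose(out_module, out_functional))  # True

# inplace=True overwrites the input tensor instead of allocating a new one
y = x.clone()
nn.ELU(inplace=True)(y)
print(torch.allclose(y, out_module))  # True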