
ELU

class torch.nn.ELU(alpha=1.0, inplace=False)

Applies the Exponential Linear Unit (ELU) function, element-wise, as described in the paper: Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs).

ELU is defined as:

\text{ELU}(x) = \begin{cases} x, & \text{if } x > 0 \\ \alpha * (\exp(x) - 1), & \text{if } x \leq 0 \end{cases}
Parameters
  • alpha (float) – the α value for the ELU formulation. Default: 1.0
  • inplace (bool) – can optionally do the operation in-place. Default: False
Shape:
  • Input: (∗), where ∗ means any number of dimensions.
  • Output: (∗), same shape as the input.
[Figure: plot of the ELU activation function.]

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.ELU()
>>> input = torch.randn(2)
>>> output = m(input)
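As a quick sanity check, the piecewise definition above can be evaluated directly with torch.where and compared against the module's output; this is a minimal sketch, and the alpha value used here is an arbitrary choice. It also shows that inplace=True returns the input tensor itself rather than allocating a new one:

>>> alpha = 1.5  # arbitrary value chosen for illustration
>>> m = nn.ELU(alpha=alpha)
>>> x = torch.randn(4)
>>> # Evaluate the closed-form definition from above directly
>>> reference = torch.where(x > 0, x, alpha * (torch.exp(x) - 1))
>>> torch.allclose(m(x), reference)
True
>>> # With inplace=True, the input tensor is overwritten and returned
>>> m_inplace = nn.ELU(alpha=alpha, inplace=True)
>>> y = x.clone()
>>> m_inplace(y) is y
True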
