
torch.nn.functional.softplus

torch.nn.functional.softplus(input, beta=1, threshold=20) → Tensor

Applies, element-wise, the function $\text{Softplus}(x) = \frac{1}{\beta} \log(1 + \exp(\beta x))$.

For numerical stability the implementation reverts to the linear function when $input \times \beta > threshold$.

See Softplus for more details.
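A short sketch illustrating both regimes with the default parameters (`beta=1`, `threshold=20`): near zero the function is smooth and softplus(0) equals log 2, while for inputs where `input * beta` exceeds the threshold the result is simply the input itself.

```python
import math
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, 0.0, 2.0, 50.0])

# Default parameters: beta=1, threshold=20.
y = F.softplus(x)

# Smooth regime: softplus(0) = log(1 + exp(0)) = log(2) ~= 0.6931.
print(y[1].item())

# Linear regime: 50 * beta > threshold, so softplus(50) returns 50 exactly.
print(y[3].item())
```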

© 2024, PyTorch Contributors
PyTorch has a BSD-style license, as found in the LICENSE file.
https://pytorch.org/docs/2.1/generated/torch.nn.functional.softplus.html