
SiLU

class torch.nn.SiLU(inplace=False)

Applies the SiLU (Sigmoid Linear Unit) function, element-wise.

\text{silu}(x) = x \cdot \sigma(x), \quad \text{where } \sigma(x) = \frac{1}{1 + e^{-x}} \text{ is the logistic sigmoid.}
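
As a quick check of the formula, the product x * sigmoid(x) can be computed by hand and compared against the module (a minimal sketch; the input values are arbitrary):

>>> import torch
>>> from torch import nn
>>> x = torch.randn(5)
>>> manual = x * torch.sigmoid(x)  # silu(x) = x * sigmoid(x)
>>> torch.allclose(nn.SiLU()(x), manual)
True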

Note

See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was later experimented with.

Shape:
  • Input: (N, *), where * means any number of additional dimensions
  • Output: (N, *), same shape as the input
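
Because the operation is element-wise, the output shape always matches the input shape, as this quick illustrative check shows (assuming torch and nn are imported as in the examples below):

>>> nn.SiLU()(torch.randn(2, 3, 4)).shape
torch.Size([2, 3, 4])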

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
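
The same operation is also exposed through the functional interface as torch.nn.functional.silu, and passing inplace=True to the module overwrites the input tensor directly instead of allocating a new one; a minimal sketch of both variants:

>>> import torch.nn.functional as F
>>> F.silu(input)                  # functional form, same result as m(input)
>>> m_inplace = nn.SiLU(inplace=True)
>>> _ = m_inplace(input)           # input itself now holds silu(input)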
