
torch.nn.functional.glu

torch.nn.functional.glu(input, dim=-1) → Tensor

The gated linear unit. Computes:

\text{GLU}(a, b) = a \otimes \sigma(b)

where the input is split in half along dim to form a and b, σ is the sigmoid function, and ⊗ is the element-wise product between matrices.

See Language Modeling with Gated Convolutional Networks.

Parameters
  • input (Tensor) – input tensor
  • dim (int) – dimension on which to split the input. Default: -1
Return type
  Tensor

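A minimal usage sketch (not from the original page): the shapes below are illustrative, and the manual split-and-gate computation is shown only to clarify the formula above.

  import torch
  import torch.nn.functional as F

  # The size of the split dimension must be even; here the last dim is 6.
  x = torch.randn(4, 6)

  out = F.glu(x, dim=-1)  # shape: (4, 3)

  # Equivalent manual computation: split the input in half along dim,
  # then multiply the first half by the sigmoid of the second half.
  a, b = x.chunk(2, dim=-1)
  manual = a * torch.sigmoid(b)

  print(out.shape)                     # torch.Size([4, 3])
  print(torch.allclose(out, manual))   # True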