
LeakyReLU

class torch.ao.nn.quantized.LeakyReLU(scale, zero_point, negative_slope=0.01, inplace=False, device=None, dtype=None) [source]

This is the quantized equivalent of torch.nn.LeakyReLU.

Parameters
  • scale (float) – quantization scale of the output tensor
  • zero_point (int) – quantization zero point of the output tensor
  • negative_slope (float) – Controls the angle of the negative slope. Default: 1e-2

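A minimal usage sketch, assuming a quint8 input tensor; the scale and zero_point values below are arbitrary illustrative choices, not values taken from this page:

    import torch
    from torch.ao.nn.quantized import LeakyReLU

    # Construct the quantized module; scale/zero_point describe the output tensor's
    # quantization parameters (illustrative values only).
    qlrelu = LeakyReLU(scale=0.25, zero_point=2, negative_slope=0.01)

    # Quantize a float tensor before passing it through the module.
    x = torch.randn(4)
    qx = torch.quantize_per_tensor(x, scale=0.5, zero_point=8, dtype=torch.quint8)

    qy = qlrelu(qx)          # output is quantized with the module's scale/zero_point
    print(qy)
    print(qy.dequantize())   # convert back to float for inspection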