
LazyBatchNorm2d

class torch.nn.LazyBatchNorm2d(eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, device=None, dtype=None) [source]

A torch.nn.BatchNorm2d module with lazy initialization of the num_features argument, which is inferred from input.size(1). The attributes that will be lazily initialized are weight, bias, running_mean and running_var.

See torch.nn.modules.lazy.LazyModuleMixin for further documentation on lazy modules and their limitations.

Parameters
  • eps (float) – a value added to the denominator for numerical stability. Default: 1e-5
  • momentum (float) – the value used for the running_mean and running_var computation. Can be set to None for cumulative moving average (i.e. simple average). Default: 0.1
  • affine (bool) – a boolean value that, when set to True, gives this module learnable affine parameters. Default: True
  • track_running_stats (bool) – a boolean value that, when set to True, makes this module track the running mean and variance; when set to False, the module does not track such statistics and initializes the running_mean and running_var buffers as None. When these buffers are None, the module always uses batch statistics, in both training and eval modes (see the sketch after this list). Default: True
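
A minimal sketch (not part of the original page) of the track_running_stats=False behavior described above: no running statistics buffers are kept, so batch statistics are used even in eval mode.

```python
import torch
import torch.nn as nn

# Illustrative sketch: with track_running_stats=False the module keeps no
# running_mean / running_var buffers (they stay None) and normalizes with
# batch statistics even in eval mode.
bn = nn.LazyBatchNorm2d(track_running_stats=False)
bn(torch.randn(4, 3, 8, 8))               # first call infers num_features=3
bn.eval()

print(bn.running_mean, bn.running_var)    # None None
out = bn(torch.randn(4, 3, 8, 8))         # still normalized per batch
print(round(out.mean().item(), 4))        # close to 0
```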
cls_to_become

alias of BatchNorm2d
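
A minimal usage sketch (illustrative, not from the original page): num_features is omitted at construction and inferred from the channel dimension on the first forward pass, after which the module becomes a regular BatchNorm2d (via cls_to_become).

```python
import torch
import torch.nn as nn

# num_features is left unspecified and inferred from input.size(1)
bn = nn.LazyBatchNorm2d()
x = torch.randn(8, 3, 32, 32)     # (N, C, H, W) -> C == 3
out = bn(x)

print(bn.num_features)            # 3, inferred lazily on the first call
print(type(bn).__name__)          # 'BatchNorm2d' -- cls_to_become swapped in
print(out.shape)                  # torch.Size([8, 3, 32, 32])
```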
