LazyBatchNorm1d
class torch.nn.LazyBatchNorm1d(eps=1e-05, momentum=0.1, affine=True, track_running_stats=True, device=None, dtype=None)
A torch.nn.BatchNorm1d module with lazy initialization of the num_features argument of the BatchNorm1d, which is inferred from input.size(1). The attributes that will be lazily initialized are weight, bias, running_mean and running_var.

Check torch.nn.modules.lazy.LazyModuleMixin for further documentation on lazy modules and their limitations.

Parameters
- eps (float) – a value added to the denominator for numerical stability. Default: 1e-5
- momentum (float) – the value used for the running_mean and running_var computation. Can be set to None for cumulative moving average (i.e. simple average). Default: 0.1
- affine (bool) – a boolean value that, when set to True, gives this module learnable affine parameters. Default: True
- track_running_stats (bool) – a boolean value that, when set to True, makes this module track the running mean and variance; when set to False, this module does not track such statistics and initializes the statistics buffers running_mean and running_var as None. When these buffers are None, this module always uses batch statistics in both training and eval modes. Default: True
cls_to_become

alias of BatchNorm1d
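
A minimal usage sketch (tensor shapes and printed values are illustrative): the module is constructed without num_features, the first forward pass infers it from input.size(1), and the module then behaves as a regular BatchNorm1d (see cls_to_become above).

```python
import torch
import torch.nn as nn

# Construct without num_features; it will be inferred lazily.
bn = nn.LazyBatchNorm1d()
print(type(bn).__name__)        # LazyBatchNorm1d

# Input of shape (batch, channels, length); channels = input.size(1) = 16.
x = torch.randn(8, 16, 100)
y = bn(x)                       # first forward infers num_features = 16

print(type(bn).__name__)        # BatchNorm1d (converted via cls_to_become)
print(bn.num_features)          # 16
print(bn.weight.shape)          # torch.Size([16])
print(bn.running_mean.shape)    # torch.Size([16])
```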