PolynomialLR
class torch.optim.lr_scheduler.PolynomialLR(optimizer, total_iters=5, power=1.0, last_epoch=-1, verbose=False)
Decays the learning rate of each parameter group using a polynomial function over the given total_iters. When last_epoch=-1, sets the initial lr as lr.
Parameters
- optimizer (Optimizer) – Wrapped optimizer.
- total_iters (int) – The number of steps that the scheduler decays the learning rate. Default: 5.
- power (float) – The power of the polynomial. Default: 1.0.
- last_epoch (int) – The index of the last epoch. Default: -1.
- verbose (bool) – If True, prints a message to stdout for each update. Default: False.
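For reference, the decay described above has a simple closed form. The sketch below is illustrative only; polynomial_lr is a hypothetical helper, not part of the PyTorch API:

>>> def polynomial_lr(base_lr, epoch, total_iters=5, power=1.0):
...     # The decay factor falls from 1.0 at epoch 0 to 0.0 at total_iters,
...     # and is clamped there for all later epochs.
...     decay = (1.0 - min(epoch, total_iters) / total_iters) ** power
...     return base_lr * decay
>>> polynomial_lr(0.001, 1, total_iters=4)  # 0.001 * (1 - 1/4) -> 0.00075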
Example
>>> # Assuming optimizer uses lr = 0.001 for all groups
>>> # lr = 0.001    if epoch == 0
>>> # lr = 0.00075  if epoch == 1
>>> # lr = 0.00050  if epoch == 2
>>> # lr = 0.00025  if epoch == 3
>>> # lr = 0.0      if epoch >= 4
>>> scheduler = PolynomialLR(self.opt, total_iters=4, power=1.0)
>>> for epoch in range(100):
>>>     train(...)
>>>     validate(...)
>>>     scheduler.step()
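The snippet above uses placeholders (self.opt, train(...), validate(...)). A self-contained variant that can be run directly, assuming a throwaway linear model just to give the optimizer parameters, might look like this:

>>> import torch
>>> from torch.optim import SGD
>>> from torch.optim.lr_scheduler import PolynomialLR
>>> model = torch.nn.Linear(2, 1)
>>> optimizer = SGD(model.parameters(), lr=0.001)
>>> scheduler = PolynomialLR(optimizer, total_iters=4, power=1.0)
>>> for epoch in range(6):
...     optimizer.step()       # a real training step would go here
...     scheduler.step()
...     print(epoch, scheduler.get_last_lr())

This prints the learning rates from the comment above, reaching 0.0 from epoch 4 onward.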
get_last_lr()
Return the last learning rate computed by the current scheduler.
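The result is a list with one entry per parameter group. An illustrative call against the scheduler built above (the value shown assumes one step of the total_iters=4, power=1.0 schedule):

>>> scheduler.get_last_lr()  # one float per parameter group
[0.00075]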
load_state_dict(state_dict)
Loads the scheduler's state.

Parameters
- state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
print_lr(is_verbose, group, lr, epoch=None)
Display the current learning rate.
state_dict()
Returns the state of the scheduler as a dict. It contains an entry for every variable in self.__dict__ which is not the optimizer.
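Together with load_state_dict(), this makes the scheduler checkpointable. A minimal sketch, assuming optimizer and scheduler already exist and using a hypothetical checkpoint.pt file:

>>> checkpoint = {
...     "optimizer": optimizer.state_dict(),
...     "scheduler": scheduler.state_dict(),
... }
>>> torch.save(checkpoint, "checkpoint.pt")
>>> # ... later, after rebuilding the model, optimizer, and scheduler ...
>>> checkpoint = torch.load("checkpoint.pt")
>>> optimizer.load_state_dict(checkpoint["optimizer"])
>>> scheduler.load_state_dict(checkpoint["scheduler"])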