PolynomialLR

class modelzoo.common.pytorch.optim.lr_scheduler.PolynomialLR (optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, end_learning_rate: float, decay_steps: int, power: float = 1.0, cycle: bool = False, disable_lr_steps_reset: bool = False)

Decays the learning rate of each parameter group using a polynomial function over the given decay_steps.

Similar to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.PolynomialLR.html#torch.optim.lr_scheduler.PolynomialLR
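
The decay follows a polynomial interpolation from the initial to the final learning rate. A minimal sketch, assuming the standard polynomial-decay formulation (exponent power applied to the remaining fraction of decay_steps); this helper is illustrative only and the exact modelzoo implementation may differ, for example in its cycling behavior:

import math

def polynomial_lr(step, initial_learning_rate, end_learning_rate,
                  decay_steps, power=1.0, cycle=False):
    # Hypothetical helper for illustration; not part of the modelzoo API.
    if cycle:
        # When cycling, the decay window expands to the next multiple of
        # decay_steps that contains the current step.
        decay_steps = decay_steps * max(1, math.ceil(step / decay_steps))
    else:
        # Without cycling, the learning rate is held at end_learning_rate
        # once decay_steps is reached.
        step = min(step, decay_steps)
    fraction = 1.0 - step / decay_steps
    return (initial_learning_rate - end_learning_rate) * fraction ** power + end_learning_rate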

Parameters:

optimizer – The optimizer to schedule.

initial_learning_rate – The initial learning rate.

end_learning_rate – The final learning rate.

decay_steps – Number of steps over which to decay the learning rate.

power – Exponent applied to the fraction of decay steps completed; 1.0 gives linear decay. Default: 1.0 (only linear decay is supported at the moment).

cycle – Whether to cycle the learning rate schedule. Default: False.
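
A hedged usage sketch, assuming the constructor signature shown above; the import path follows the class path given in this page, and the stepping pattern follows standard PyTorch practice, both of which may differ between modelzoo releases:

import torch
from modelzoo.common.pytorch.optim.lr_scheduler import PolynomialLR

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Linearly decay the learning rate from 0.1 to 1e-4 over 1000 steps.
scheduler = PolynomialLR(
    optimizer,
    initial_learning_rate=0.1,
    end_learning_rate=1e-4,
    decay_steps=1000,
    power=1.0,   # linear decay (only linear is supported at the moment)
    cycle=False,
)

for step in range(1000):
    optimizer.step()
    scheduler.step()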