CosineAnnealingLR

class modelzoo.common.pytorch.optim.lr_scheduler.CosineAnnealingLR(optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, T_max: int, eta_min: float, disable_lr_steps_reset: bool = False)

Set the learning rate of each parameter group using a cosine annealing schedule, where $\eta_{max}$ is set to the initial learning rate and $T_{cur}$ is the number of steps since the last restart in SGDR:

$$\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right), \qquad T_{cur} \neq (2k+1)T_{max}$$

$$\eta_{t+1} = \eta_t + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 - \cos\left(\frac{1}{T_{max}}\pi\right)\right), \qquad T_{cur} = (2k+1)T_{max}$$

Notice that because the schedule is defined recursively, the learning rate can be simultaneously modified outside this scheduler by other operators. If the learning rate is set solely by this scheduler, the learning rate at each step becomes:

$$\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right)$$
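To make the closed-form expression concrete, the short plain-Python sketch below evaluates it over one annealing cycle (the hyperparameter values 0.1, 10, and 0.001 are arbitrary, chosen only for illustration):

    import math

    def cosine_annealing_lr(step, initial_lr, T_max, eta_min=0.0):
        # Closed form: eta_min + 0.5 * (eta_max - eta_min)
        #              * (1 + cos(pi * T_cur / T_max))
        return eta_min + 0.5 * (initial_lr - eta_min) * (
            1.0 + math.cos(math.pi * step / T_max)
        )

    # The schedule starts at initial_lr (step 0) and reaches eta_min at step T_max.
    for step in range(11):
        print(step, round(cosine_annealing_lr(step, initial_lr=0.1, T_max=10, eta_min=0.001), 5))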

It has been proposed in SGDR: Stochastic Gradient Descent with Warm Restarts. Note that this only implements the cosine annealing part of SGDR, and not the restarts.

This class is similar to PyTorch's torch.optim.lr_scheduler.CosineAnnealingLR: https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.CosineAnnealingLR.html

Parameters:

optimizer – The optimizer to schedule

initial_learning_rate – The initial learning rate

T_max – Maximum number of iterations

eta_min – Minimum learning rate
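
A minimal usage sketch, assuming the class is importable from the path shown in the signature above and that, like the torch.optim schedulers, step() is called once per training iteration; the model, data, and hyperparameter values are placeholders:

    import torch
    from modelzoo.common.pytorch.optim.lr_scheduler import CosineAnnealingLR

    model = torch.nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Anneal the learning rate from 0.1 down to 1e-4 over 1000 steps.
    scheduler = CosineAnnealingLR(
        optimizer,
        initial_learning_rate=0.1,
        T_max=1000,
        eta_min=1e-4,
    )

    for step in range(1000):
        optimizer.zero_grad()
        loss = model(torch.randn(8, 10)).mean()  # dummy batch
        loss.backward()
        optimizer.step()
        scheduler.step()  # advance the cosine schedule by one step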