StepLR

class modelzoo.common.pytorch.optim.lr_scheduler.StepLR(optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, step_size: int, gamma: float, disable_lr_steps_reset: bool = False)

Decays the learning rate of each parameter group by gamma every step_size steps. Note that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler. Similar to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.StepLR.html#torch.optim.lr_scheduler.StepLR
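
Assuming the decay follows the same rule as the linked PyTorch scheduler, the learning rate after t scheduler steps can be written as:

    \mathrm{lr}(t) = \mathrm{initial\_learning\_rate} \cdot \gamma^{\lfloor t / \mathrm{step\_size} \rfloor}

For example, with initial_learning_rate = 0.1, step_size = 10, and gamma = 0.5, the learning rate is 0.1 for steps 0-9, 0.05 for steps 10-19, 0.025 for steps 20-29, and so on.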

Parameters:

optimizer – The optimizer to schedule

initial_learning_rate – The initial learning rate

step_size – Period of learning rate decay

gamma – Multiplicative factor of learning rate decay
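
A minimal usage sketch, assuming the scheduler is stepped once per optimizer step as with the standard PyTorch schedulers. The model, data, and training loop below are placeholders for illustration only and are not part of the modelzoo API:

    import torch
    from modelzoo.common.pytorch.optim.lr_scheduler import StepLR

    model = torch.nn.Linear(10, 1)  # placeholder model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    # Decay the learning rate by a factor of 0.5 every 10 steps
    scheduler = StepLR(
        optimizer,
        initial_learning_rate=0.1,
        step_size=10,
        gamma=0.5,
    )

    for step in range(30):
        optimizer.zero_grad()
        loss = model(torch.randn(4, 10)).sum()  # placeholder loss
        loss.backward()
        optimizer.step()
        scheduler.step()  # assumed: advance the schedule once per optimizer step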