ExponentialLR

class modelzoo.common.pytorch.optim.lr_scheduler.ExponentialLR(optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, decay_steps: int, decay_rate: float, staircase: bool = False, disable_lr_steps_reset: bool = False)

Decays the learning rate of each parameter group by decay_rate every decay_steps steps.

Similar to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.ExponentialLR.html#torch.optim.lr_scheduler.ExponentialLR

Parameters:

optimizer – The optimizer to schedule

initial_learning_rate – The initial learning rate

decay_steps – The number of steps over which the learning rate decays by one factor of decay_rate

decay_rate – The decay rate

staircase – If True, decay the learning rate at discrete intervals
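
The exact update rule is not spelled out on this page, but the parameter names mirror TensorFlow-style exponential decay, so the schedule presumably computes lr = initial_learning_rate * decay_rate ** (step / decay_steps), flooring the exponent to an integer when staircase is True. The sketch below illustrates that assumed behavior in plain Python; it is not the library implementation, and the helper name exponential_lr is made up for illustration.

    import math

    def exponential_lr(initial_learning_rate: float,
                       decay_rate: float,
                       decay_steps: int,
                       step: int,
                       staircase: bool = False) -> float:
        """Assumed TF-style exponential decay matching the parameter names above."""
        exponent = step / decay_steps
        if staircase:
            # Staircase mode holds the learning rate constant within each
            # decay_steps-sized interval, then drops it by a factor of decay_rate.
            exponent = math.floor(exponent)
        return initial_learning_rate * (decay_rate ** exponent)

    # Example: lr starts at 0.1 and is multiplied by 0.9 every 1000 steps.
    for step in (0, 500, 1000, 2000):
        print(step, exponential_lr(0.1, 0.9, 1000, step, staircase=True))
    # -> approximately 0.1, 0.1, 0.09 (= 0.1 * 0.9), 0.081 (= 0.1 * 0.9 ** 2)

In training code, the class would presumably be constructed with the signature shown above (e.g. ExponentialLR(optimizer, initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.9)) and stepped once per optimizer update, as with standard PyTorch LR schedulers; consult the Model Zoo source for the exact interface.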