.. _MultiStepLR:

MultiStepLR
===========

class ``modelzoo.common.pytorch.optim.lr_scheduler.MultiStepLR`` (optimizer: torch.optim.optimizer.Optimizer, initial_learning_rate: float, gamma: float, milestones: List[int], disable_lr_steps_reset: bool = False)

Decays the learning rate of each parameter group by ``gamma`` once the number of steps reaches one of the milestones. Note that such decay can happen simultaneously with other changes to the learning rate from outside this scheduler.

Similar to https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.MultiStepLR.html#torch.optim.lr_scheduler.MultiStepLR

Parameters:

- **optimizer** – The optimizer to schedule
- **initial_learning_rate** – The initial learning rate
- **gamma** – Multiplicative factor of learning rate decay
- **milestones** – List of step indices at which to apply the decay; must be increasing
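The decay rule can be summarized as: the learning rate at a given step equals the initial learning rate multiplied by ``gamma`` raised to the number of milestones already reached. Below is a minimal sketch of that rule as a standalone function; the function name ``multistep_lr`` is illustrative and not part of the modelzoo API.

.. code-block:: python

    from bisect import bisect_right
    from typing import List

    def multistep_lr(initial_learning_rate: float, gamma: float,
                     milestones: List[int], step: int) -> float:
        """Illustrative sketch of the MultiStepLR decay rule.

        The learning rate is decayed by ``gamma`` once for every
        milestone that ``step`` has reached (milestones must be
        increasing). ``bisect_right`` counts those milestones.
        """
        return initial_learning_rate * gamma ** bisect_right(milestones, step)

For example, with ``initial_learning_rate=0.1``, ``gamma=0.1``, and ``milestones=[30, 80]``, the learning rate is 0.1 for steps 0-29, 0.01 for steps 30-79, and 0.001 from step 80 onward.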