.. _supported-pytorch-optimizers:

Supported PyTorch Optimizers
============================

These optimizers are drop-in replacements from ``modelzoo.common.optim.*`` and can replace their ``torch.optim.*`` counterparts without any code modifications. The following optimizers work only in weight streaming mode:

``modelzoo.common.optim.Adadelta``
    Implements the Adadelta algorithm.

``modelzoo.common.optim.Adagrad``
    Implements the Adagrad algorithm.

``modelzoo.common.optim.Adam``
    Implements the Adam algorithm.

``modelzoo.common.optim.AdamW``
    Implements the AdamW algorithm.

``modelzoo.common.optim.Adamax``
    Implements the Adamax algorithm (a variant of Adam based on the infinity norm).

``modelzoo.common.optim.ASGD``
    Implements Averaged Stochastic Gradient Descent.

``modelzoo.common.optim.NAdam``
    Implements the NAdam algorithm.

``modelzoo.common.optim.RAdam``
    Implements the RAdam algorithm.

``modelzoo.common.optim.RMSProp``
    Implements the RMSprop algorithm.

``modelzoo.common.optim.Rprop``
    Implements the resilient backpropagation algorithm.

``modelzoo.common.optim.SGD``
    Implements stochastic gradient descent (optionally with momentum).
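
For example, replacing ``torch.optim.AdamW`` amounts to changing the import. The sketch below assumes the ``modelzoo.common.optim.AdamW`` class accepts the same constructor arguments as its ``torch.optim`` counterpart, per the drop-in claim above; the model and hyperparameters are illustrative only.

.. code-block:: python

    import torch
    import torch.nn as nn

    # Illustrative model; any nn.Module works the same way.
    model = nn.Linear(10, 2)

    # Standard PyTorch usage would be:
    #   optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

    # Drop-in replacement with the same constructor arguments
    # (import path assumed from this page):
    from modelzoo.common.optim import AdamW

    optimizer = AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

    # The usual training-step pattern is unchanged.
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()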