Supported PyTorch Optimizers

These optimizers, provided under modelzoo.common.optim.*, are drop-in replacements for their torch.optim.* counterparts and can be swapped in without any other code modifications.
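For illustration, a minimal sketch of such a swap is shown below. The import path follows the module names listed in this section; the constructor arguments are assumed to mirror torch.optim.AdamW, so check the Model Zoo source for the exact signature.

```python
import torch

# from torch.optim import AdamW            # stock PyTorch optimizer
from modelzoo.common.optim import AdamW    # drop-in replacement (same listed module path)

model = torch.nn.Linear(128, 10)

# Arguments assumed to mirror torch.optim.AdamW(params, lr=..., weight_decay=...)
optimizer = AdamW(model.parameters(), lr=1e-3, weight_decay=0.01)

for step in range(10):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 128)).sum()
    loss.backward()
    optimizer.step()
```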

The following optimizers work only in weight streaming mode:

modelzoo.common.optim.Adadelta

Implements Adadelta algorithm.

modelzoo.common.optim.Adagrad

Implements Adagrad algorithm.

modelzoo.common.optim.Adam

Implements Adam algorithm.

modelzoo.common.optim.AdamW

Implements AdamW algorithm.

modelzoo.common.optim.Adamax

Implements Adamax algorithm (a variant of Adam based on infinity norm).

modelzoo.common.optim.ASGD

Implements Averaged Stochastic Gradient Descent.

modelzoo.common.optim.NAdam

Implements NAdam algorithm.

modelzoo.common.optim.RAdam

Implements RAdam algorithm.

modelzoo.common.optim.RMSProp

Implements RMSprop algorithm.

modelzoo.common.optim.Rprop

Implements the resilient backpropagation algorithm.

modelzoo.common.optim.SGD

Implements stochastic gradient descent (optionally with momentum).
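As a hedged sketch, momentum would be passed the same way as with torch.optim.SGD, assuming the Model Zoo class accepts the same arguments:

```python
import torch
from modelzoo.common.optim import SGD

model = torch.nn.Linear(128, 10)

# Assumes the constructor mirrors torch.optim.SGD; verify supported
# arguments against the Model Zoo source.
optimizer = SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=5e-4)
```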