cerebras_pytorch.optim

Available optimizers in the cerebras_pytorch package:
Adadelta
Adafactor
Adagrad
Adamax
AdamW
Adam
ASGD
Lamb
Lion
NAdam
RAdam
RMSprop
Rprop
SGD
optim.ASGD
optim.Adadelta
optim.Adafactor
optim.Adagrad
optim.AdamBase
optim.Adam
optim.AdamW
optim.Adamax
optim.Lamb
optim.Lion
optim.NAdam
optim.RAdam
optim.RMSprop
optim.Rprop
optim.SGD
optim.Optimizer
optim helpers
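To make the optimizer list above concrete, here is a minimal pure-Python sketch of the update rule behind AdamW (decoupled weight decay on top of Adam's moment estimates). This illustrates the algorithm only; it is not the cerebras_pytorch implementation, which operates on tensors and carries backend-specific logic, and the parameter names here are illustrative.

```python
import math

def adamw_step(theta, grad, m, v, step, lr=1e-3, beta1=0.9,
               beta2=0.999, eps=1e-8, weight_decay=0.01):
    """One AdamW update for a single scalar parameter (sketch only)."""
    # Update biased first and second moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    # Bias-correct the moments.
    m_hat = m / (1 - beta1 ** step)
    v_hat = v / (1 - beta2 ** step)
    # Decoupled weight decay: applied directly to the parameter rather
    # than folded into the gradient -- this is what distinguishes AdamW
    # from Adam with L2 regularization.
    theta = theta - lr * weight_decay * theta
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# First step on a parameter of 1.0 with gradient 0.5:
new_theta, m, v = adamw_step(1.0, 0.5, 0.0, 0.0, step=1)
```

The same moment bookkeeping underlies Adam, Adamax, NAdam, and RAdam; they differ in how the moments are corrected and combined into the step.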
Learning Rate Schedulers in cerebras_pytorch

Available learning rate schedulers in the cerebras_pytorch package:
ConstantLR
PolynomialLR
LinearLR
ExponentialLR
InverseExponentialTimeDecayLR
InverseSquareRootDecayLR
CosineDecayLR
SequentialLR
PiecewiseConstantLR
MultiStepLR
StepLR
CosineAnnealingLR
LambdaLR
CosineAnnealingWarmRestarts
MultiplicativeLR
ChainedScheduler
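As a concrete example of what these schedulers compute, here is a pure-Python sketch of the math behind a LinearLR-style schedule: the learning rate is interpolated linearly from an initial value to an end value over a fixed number of iterations, then held. Parameter names are illustrative and are not the exact cerebras_pytorch signature.

```python
def linear_lr(step, initial_lr, end_lr, total_iters):
    """Learning rate at a given step under a linear schedule (sketch).

    Ramps linearly from initial_lr at step 0 to end_lr at
    step == total_iters, and stays at end_lr afterwards.
    """
    if step >= total_iters:
        return end_lr
    frac = step / total_iters
    return initial_lr + frac * (end_lr - initial_lr)

# Warmup-style ramp from 0.0 to 0.1 over 100 steps:
lr_at_50 = linear_lr(50, 0.0, 0.1, 100)
```

The other schedulers in the list follow the same pattern with a different step-to-rate function (exponential, cosine, piecewise constant, and so on); SequentialLR and ChainedScheduler compose several such schedules.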