cerebras_pytorch.optim

The following optimizers are available in the cerebras_pytorch package; a construction sketch follows the list.

Adadelta

Adafactor

Adagrad

Adamax

AdamW

Adam

ASGD

Lamb

Lion

NAdam

RAdam

RMSprop

Rprop

SGD
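
Most of these mirror their torch.optim namesakes and are constructed the same way. A minimal sketch, assuming the constructor arguments mirror torch.optim.AdamW (the hyperparameter values are illustrative):

    import torch
    import cerebras_pytorch as cstorch

    model = torch.nn.Linear(128, 10)

    # Assumes the constructor mirrors torch.optim.AdamW; values are illustrative.
    optimizer = cstorch.optim.AdamW(
        model.parameters(),
        lr=1e-3,
        weight_decay=0.01,
    )

    # Used with the usual PyTorch update semantics.
    loss = model(torch.randn(4, 128)).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()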

optim.ASGD

optim.Adadelta

optim.Adafactor

optim.Adagrad

optim.AdamBase

optim.Adam

optim.AdamW

optim.Adamax

optim.Lamb

optim.Lion

optim.NAdam

optim.RAdam

optim.RMSprop

optim.Rprop

optim.SGD

optim.Optimizer
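
This is the base class from which the optimizers above derive; custom optimizers are written by subclassing it. The sketch below shows the general shape under two assumptions that should be verified against this class's reference: that the constructor accepts (params, defaults) as torch.optim.Optimizer does, and that a preinitialize hook exists for declaring optimizer state ahead of tracing. The SignSGD update rule itself is purely illustrative.

    import torch
    import cerebras_pytorch as cstorch

    class SignSGD(cstorch.optim.Optimizer):
        """Illustrative optimizer: steps in the direction of sign(grad)."""

        def __init__(self, params, lr=1e-3):
            # Assumes the base constructor mirrors torch.optim.Optimizer.
            super().__init__(params, defaults=dict(lr=lr))

        def preinitialize(self):
            # Assumed Cerebras-specific hook for declaring optimizer state
            # tensors up front; SignSGD keeps no per-parameter state.
            pass

        @torch.no_grad()
        def step(self, closure=None):
            loss = closure() if closure is not None else None
            for group in self.param_groups:
                for p in group["params"]:
                    if p.grad is not None:
                        p.add_(torch.sign(p.grad), alpha=-group["lr"])
            return loss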

optim helpers
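
The optim helpers typically cover configuration-driven construction of optimizers and schedulers, as consumed from YAML-style configs. A minimal sketch; the helper names configure_optimizer and configure_lr_scheduler, and the keys accepted in each learning_rate entry, are assumptions to verify against this section:

    import torch
    import cerebras_pytorch as cstorch

    model = torch.nn.Linear(128, 10)

    # Assumed helper: builds the optimizer named by optimizer_type.
    optimizer = cstorch.optim.configure_optimizer(
        optimizer_type="AdamW",
        params=model.parameters(),
        lr=1e-3,
        weight_decay=0.01,
    )

    # Assumed helper: builds a learning rate scheduler from dictionary
    # specifications (several entries compose into a sequential schedule).
    lr_scheduler = cstorch.optim.configure_lr_scheduler(
        optimizer,
        learning_rate=[
            {
                "scheduler": "LinearLR",
                "initial_learning_rate": 0.0,
                "end_learning_rate": 1e-3,
                "total_iters": 100,
            },
            {
                "scheduler": "CosineDecayLR",
                "initial_learning_rate": 1e-3,
                "end_learning_rate": 0.0,
                "total_iters": 900,
            },
        ],
    )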

Learning Rate Schedulers in cerebras_pytorch

The following learning rate schedulers are available in the cerebras_pytorch package; a usage sketch follows the list.

ConstantLR

PolynomialLR

LinearLR

ExponentialLR

InverseExponentialTimeDecayLR

InverseSquareRootDecayLR

CosineDecayLR

SequentialLR

PiecewiseConstantLR

MultiStepLR

StepLR

CosineAnnealingLR

LambdaLR

CosineAnnealingWarmRestarts

MultiplicativeLR

ChainedScheduler

CyclicLR

OneCycleLR
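
These schedulers are stepped once per optimizer step. A minimal sketch, assuming LinearLR is parameterized by absolute learning rates (initial_learning_rate, end_learning_rate) and a step horizon (total_iters); the exact parameter names should be checked against the class reference below:

    import torch
    import cerebras_pytorch as cstorch

    model = torch.nn.Linear(128, 10)
    optimizer = cstorch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

    # Assumed parameter names; ramps the learning rate from 0.1 down to 0.01
    # over the first 1000 optimizer steps.
    lr_scheduler = cstorch.optim.lr_scheduler.LinearLR(
        optimizer,
        initial_learning_rate=0.1,
        end_learning_rate=0.01,
        total_iters=1000,
    )

    for step in range(1000):
        loss = model(torch.randn(4, 128)).sum()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        lr_scheduler.step()  # advance the schedule once per optimizer step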

optim.lr_scheduler.LRScheduler

optim.lr_scheduler.ConstantLR

optim.lr_scheduler.PolynomialLR

optim.lr_scheduler.LinearLR

optim.lr_scheduler.ExponentialLR

optim.lr_scheduler.InverseExponentialTimeDecayLR

optim.lr_scheduler.InverseSquareRootDecayLR

optim.lr_scheduler.CosineDecayLR

optim.lr_scheduler.SequentialLR
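
A common pattern is a linear warmup followed by a decay, composed with SequentialLR. The sketch below assumes SequentialLR mirrors torch.optim.lr_scheduler.SequentialLR (a list of schedulers plus the milestones at which to switch) and reuses the assumed parameter names from the LinearLR example above; CosineDecayLR's arguments are likewise assumptions to verify against its reference:

    import torch
    import cerebras_pytorch as cstorch

    model = torch.nn.Linear(128, 10)
    optimizer = cstorch.optim.AdamW(model.parameters(), lr=1e-3)

    # Assumed parameter names for both schedulers.
    warmup = cstorch.optim.lr_scheduler.LinearLR(
        optimizer, initial_learning_rate=0.0, end_learning_rate=1e-3, total_iters=100,
    )
    decay = cstorch.optim.lr_scheduler.CosineDecayLR(
        optimizer, initial_learning_rate=1e-3, end_learning_rate=0.0, total_iters=900,
    )

    # Assumed to mirror torch's SequentialLR: switch from warmup to decay
    # after 100 scheduler steps.
    lr_scheduler = cstorch.optim.lr_scheduler.SequentialLR(
        optimizer,
        schedulers=[warmup, decay],
        milestones=[100],
    )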

optim.lr_scheduler.PiecewiseConstantLR

optim.lr_scheduler.MultiStepLR

optim.lr_scheduler.StepLR

optim.lr_scheduler.CosineAnnealingLR

optim.lr_scheduler.LambdaLR

optim.lr_scheduler.CosineAnnealingWarmRestarts

optim.lr_scheduler.MultiplicativeLR

optim.lr_scheduler.ChainedScheduler

optim.lr_scheduler.CyclicLR

optim.lr_scheduler.OneCycleLR