Enable kernel generalizability with Autogen#
Autogen enables broader generalizability across variations of ML models supported on Cerebras hardware without the need for handwritten kernels. Enable Autogen by adding `autogen_policy: medium` to the `runconfig` section of the yaml file for non-loss operations, or by passing `use_autogen=True` when defining loss functions.
Example use cases#
Enable changing of the non-linearity#
You may modify the non-linearity of GPT-style models in Cerebras Model Zoo (e.g., to LeakyRelu) by modifying the parameters yaml file:
```yaml
model:
  nonlinearity: "leaky_relu"
  ...
```
When compiling the model with this change, you may see the following compilation error:

```text
KM to RT IR Translation Failed: <unknown>:0: error: loc("custom-call.2473"): failed to legalize operation 'cirh.LeakyRelu'
```

By adding `autogen_policy: "medium"` to the `runconfig` section of the parameters yaml file, you will be able to compile using the LeakyRelu non-linearity.
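As a sketch, the relevant `runconfig` entry might look like this (all other `runconfig` fields are omitted here; keep your existing settings):

```yaml
runconfig:
  autogen_policy: "medium"
```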
Improve performance of loss functions#
As an experimental feature, you can enable Autogen to use fused autogenerated graphs for losses in PyTorch. This attempts to improve the performance of these losses compared to previous implementations that used primitive kernels.
List of supported losses:

- `Loss.L1Loss`
- `Loss.MSELoss`
- `PoissonNLLLoss`
- `CrossEntropyLoss`
- `BCELoss`
- `Loss.KLDivLoss`
- `Loss.BCEWithLogitsLoss`
- `Loss.SmoothL1Loss`
- `Loss.MarginRankingLoss`
- `Loss.HingeEmbeddingLoss`
- `Loss.GaussianNLLLoss`
- `Loss.HuberLoss`
- `Loss.MultiMarginLoss`
- `Loss.NLLLoss`
- `Loss.MultiLabelSoftMarginLoss`
- `Loss.TripletMarginLoss`
- `Loss.TripletMarginWithDistanceLoss`
Unsupported losses: `Loss.CosineEmbeddingLoss` will compile to primitive kernels, and its performance will be slower.
To use this feature, pass `use_autogen=True` in the loss definition when defining the model architecture. For example:

```python
loss = MSELoss(..., use_autogen=True)
```
AutoGen for custom loss functions#
You may encounter a compilation failure due to a graph mismatch for user-created custom losses. If this occurs, enable AutoGen for the custom loss by adding the AutoGen wrapper `autogen_loss` as a decorator on the loss class:
```python
from modelzoo.common.pytorch.layers.utils import autogen_loss

@autogen_loss
class CustomLoss(nn.Module):
    def __init__(self, ...):
```
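`autogen_loss` is a standard Python class decorator, so it wraps the loss class without changing how you instantiate or call it. The sketch below illustrates only the decorator pattern itself; the decorator name `mark_autogen`, the `uses_autogen` attribute, and the toy loss are hypothetical stand-ins, not the real Cerebras implementation:

```python
# Hypothetical sketch of the class-decorator pattern used by autogen_loss.
# The real decorator hooks the loss class into Autogen compilation; here we
# only tag the class with a marker attribute to show the mechanics.
def mark_autogen(cls):
    cls.uses_autogen = True  # hypothetical marker, for illustration only
    return cls

@mark_autogen
class CustomLoss:
    def __init__(self, reduction="mean"):
        self.reduction = reduction

    def forward(self, pred, target):
        # Toy mean-squared-error, standing in for a real custom loss.
        diffs = [(p - t) ** 2 for p, t in zip(pred, target)]
        return sum(diffs) / len(diffs)

loss = CustomLoss()
print(loss.uses_autogen)                      # True: the decorator tagged the class
print(loss.forward([1.0, 2.0], [1.0, 4.0]))   # 2.0
```

Because decoration happens at class-definition time, every instance of the loss carries the marker, and the rest of the model code needs no changes.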