.. _supported-pytorch-ops:

Supported PyTorch Ops
=====================

Cerebras currently supports the following PyTorch operations.

.. attention::

   The following list of supported PyTorch ops is preliminary. We cannot
   guarantee that mixing and matching them in your models will work. Support
   is only provided for the way they are used in the Cerebras Reference
   Samples Git repository.

nn
--

- ``torch.nn.BCEWithLogitsLoss``
- ``torch.nn.CrossEntropyLoss``
- ``torch.nn.Dropout``
- ``torch.nn.Embedding``
- ``torch.nn.LayerNorm``
- ``torch.nn.Linear``
- ``torch.nn.MSELoss``
- ``torch.nn.NLLLoss``
- ``torch.nn.ReLU``
- ``torch.nn.Softmax``
- ``torch.nn.Tanh``

Functional
----------

- ``torch.nn.functional.dropout``
- ``torch.nn.functional.gelu``
- ``torch.nn.functional.log_softmax``
- ``torch.nn.functional.pad``
- ``torch.nn.functional.relu``
- ``torch.nn.functional.silu``
- ``torch.nn.functional.softmax``

Other ops
---------

- ``torch.abs``
- ``torch.all``
- ``torch.arange``
- ``torch.broadcast_to``
- ``torch.cat``
- ``torch.einsum``
- ``torch.flatten``
- ``torch.full``
- ``torch.full_like``
- ``torch.gather``
- ``torch.log``
- ``torch.matmul``
- ``torch.min``
- ``torch.ones``
- ``torch.rsqrt``
- ``torch.sigmoid``
- ``torch.sum``
- ``torch.tanh``
- ``torch.where``
- ``torch.zeros``
- ``torch.zeros_like``
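To illustrate how several of the ops above might be combined, here is a minimal sketch of a toy classifier. This example is not taken from the Cerebras Reference Samples; the model class, layer sizes, and vocabulary size are all illustrative assumptions, and per the note above there is no guarantee that every combination of supported ops compiles.

.. code-block:: python

   # Hypothetical toy model using only ops from the supported list:
   # nn.Embedding, nn.LayerNorm, nn.Linear, nn.functional.relu,
   # torch.sum, torch.zeros, and nn.CrossEntropyLoss.
   import torch
   import torch.nn as nn
   import torch.nn.functional as F

   class TinyClassifier(nn.Module):
       def __init__(self, vocab_size=100, hidden=32, num_classes=4):
           super().__init__()
           self.embed = nn.Embedding(vocab_size, hidden)   # torch.nn.Embedding
           self.norm = nn.LayerNorm(hidden)                # torch.nn.LayerNorm
           self.fc = nn.Linear(hidden, num_classes)        # torch.nn.Linear

       def forward(self, token_ids):
           x = self.embed(token_ids)        # (batch, seq, hidden)
           x = F.relu(self.norm(x))         # torch.nn.functional.relu
           x = x.sum(dim=1)                 # torch.sum over the sequence axis
           return self.fc(x)                # (batch, num_classes)

   model = TinyClassifier()
   token_ids = torch.zeros(2, 5, dtype=torch.long)   # torch.zeros
   logits = model(token_ids)                         # shape (2, 4)
   loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1]))

The same structure would apply to the loss and activation variants in the list (e.g. swapping ``nn.ReLU`` for ``F.relu``, or ``nn.NLLLoss`` plus ``F.log_softmax`` for ``nn.CrossEntropyLoss``).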