cerebras.modelzoo.common.utils.model.lora

Functions

disable_lora_merge_weights

Sets merge_weights=False in LoRA parameters.

get_lora_config_for_module

Gets the LoRA parameters for a particular module.

make_model_lora

Create a Low Rank Adaptation (LoRA) model from a non-LoRA model.

make_model_lora_helper

Helper used by make_model_lora.

Classes

LoRALayer

Base LoRA layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.

LoRA_Embedding

LoRA embedding layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.

LoRA_Linear

LoRA linear layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.
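
For orientation, the idea common to these layers is that a frozen base weight W is augmented with a trainable low-rank update scaled by alpha/r. The following is a minimal, simplified sketch of that forward pass, not the modelzoo implementation; the class name SimpleLoRALinear and its defaults are hypothetical.

import torch
import torch.nn as nn

class SimpleLoRALinear(nn.Module):
    """Minimal illustration of a LoRA-augmented linear layer (hypothetical)."""

    def __init__(self, in_features, out_features, r=8, alpha=16, dropout=0.0):
        super().__init__()
        # Frozen base projection W.
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)
        # Trainable low-rank factors: A is (r x in), B is (out x r).
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        # Base output plus scaled low-rank update: W x + (alpha / r) * B A x
        update = self.dropout(x) @ self.lora_A.T @ self.lora_B.T
        return self.base(x) + update * self.scaling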

LoraConfig

r: Rank of the LoRA matrix projections.

alpha: Scaling factor (see the LoRA paper for additional details).

dropout: Dropout to apply to LoRA updates.

fan_in_fan_out:

merge_weights: Determines whether LoRA weights should be merged/folded into the underlying layers.

target_modules: A list of module names that must all exist in layers that will be converted to LoRA. For example, setting target_modules to ["TransformerDecoderLayer", "Linear"] converts every Linear layer that is a child of a TransformerDecoderLayer to LoRA.
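
The sketch below shows how such a config might be built and applied. It is a usage sketch only: the keyword names mirror the fields listed above, but the exact defaults and the make_model_lora signature (assumed here to take the model and the config) should be checked against the source.

import torch.nn as nn

from cerebras.modelzoo.common.utils.model.lora import LoraConfig, make_model_lora

# Assumed keyword arguments; they follow the field names documented above.
lora_params = LoraConfig(
    r=8,                      # rank of the low-rank update
    alpha=16,                 # scaling factor applied to the update
    dropout=0.05,             # dropout on the LoRA path
    merge_weights=False,      # keep LoRA weights separate from base weights
    target_modules=["TransformerDecoderLayer", "Linear"],
)

# Toy model containing Linear layers inside TransformerDecoderLayer modules,
# so the target_modules filter above matches them.
model = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model=64, nhead=4), num_layers=2
)

# Assumed call pattern: convert the matching layers to their LoRA counterparts.
lora_model = make_model_lora(model, lora_params)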