I just run:

from unsloth import FastLanguageModel
from transformers import GemmaForCausalLM

model = GemmaForCausalLM.from_pretrained('PATH TO GEMMA2B')
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj",
                      "lm_head", "embed_tokens",],
    lora_alpha = 16,
)
The first bug I hit (I just commented the failing assertion out in the unsloth source):
Traceback (most recent call last):
File "/data/ruanjh/best_training_method/sft/naive_train.py", line 195, in <module>
model = FastLanguageModel.get_peft_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/unsloth/models/llama.py", line 1736, in get_peft_model
assert(max_seq_length <= model.max_seq_length)
^^^^^^^^^^^^^^^^^^^^
File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1709, in __getattr__
raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'GemmaForCausalLM' object has no attribute 'max_seq_length'
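Rather than deleting the assertion, it might be enough to set the missing attribute on the model before calling get_peft_model. A minimal sketch, assuming the attribute only needs to exist and using 2048 purely as a placeholder value:

# Hypothetical workaround instead of editing llama.py: give the plain
# transformers model the attribute the assertion reads. 2048 is an assumed
# value; it should be >= the max_seq_length you intend to train with.
model.max_seq_length = 2048

This only papers over the check, though, as the next error shows.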
The next error:
model = FastLanguageModel.get_peft_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/unsloth/models/llama.py", line 1899, in get_peft_model
_saved_temp_tokenizer = model._saved_temp_tokenizer
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1709, in __getattr__
raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
AttributeError: 'GemmaForCausalLM' object has no attribute '_saved_temp_tokenizer'
Only after I also commented out this attribute access in the unsloth source did it run properly, but I'm not sure whether I will always need to do this.
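For what it's worth, both missing attributes (max_seq_length and _saved_temp_tokenizer) look like things unsloth attaches when the model is loaded through its own loader rather than plain transformers, so loading Gemma via FastLanguageModel.from_pretrained may avoid patching the source at all. A sketch under that assumption (the path and max_seq_length are placeholders, and it assumes the installed unsloth version supports Gemma):

from unsloth import FastLanguageModel

# Load through unsloth's loader so it attaches max_seq_length, a temporary
# tokenizer, etc. before get_peft_model runs.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "PATH TO GEMMA2B",  # same local path as above
    max_seq_length = 2048,           # assumed; match your training setup
    dtype = None,                    # let unsloth pick fp16/bf16
    load_in_4bit = False,
)
model = FastLanguageModel.get_peft_model(
    model,
    r = 16,
    target_modules = ["q_proj", "k_proj", "v_proj", "o_proj",
                      "gate_proj", "up_proj", "down_proj",
                      "lm_head", "embed_tokens"],
    lora_alpha = 16,
)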