Closed: yunsangju closed this issue 4 months ago
This issue has been fixed by commit 3c522e7. You can pull the latest version of the fine-tuning code to check whether the problem still exists.
@yunsangju
@yuhangzang @myownskyW7 Hello. When I run the code below, the PLoRA layers still disappear.
```python
from peft import LoraConfig, get_peft_model

# Freeze all base-model parameters before attaching LoRA adapters.
for name, param in model.model.named_parameters():
    param.requires_grad = False

lora_config = LoraConfig(
    r=lora_args.lora_r,
    lora_alpha=lora_args.lora_alpha,
    target_modules=lora_args.lora_target_modules,
    lora_dropout=lora_args.lora_dropout,
    bias=lora_args.lora_bias,
    task_type='CAUSAL_LM',
)
model = get_peft_model(model, lora_config)
```
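One way to confirm what happens is to count the PLoRA modules before and after wrapping. This is a minimal sketch, assuming the partial-LoRA layer class in the repo's modeling code is literally named `PLoRA`:

```python
# Sketch: count modules whose class is named "PLoRA" (an assumption about
# the repo's modeling code) to see whether get_peft_model replaces them.
def count_plora(module):
    return sum(
        1 for _, m in module.named_modules()
        if type(m).__name__ == "PLoRA"
    )

print("PLoRA modules before wrapping:", count_plora(model))
peft_model = get_peft_model(model, lora_config)
print("PLoRA modules after wrapping:", count_plora(peft_model))
# If the second count drops to zero, PEFT has swapped the PLoRA layers
# for its own lora.Linear modules, and their extra weights are discarded.
```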
Hello. When I apply LoRA to the decoder, the existing PLoRA modules in the model structure are converted to plain Linear layers. I don't think this utilizes the PLoRA weights you have already trained. Is this okay?
[Screenshot: model structure before applying LoRA]
[Screenshot: model structure after applying LoRA]
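If the goal is to keep the pretrained PLoRA layers intact, one possible workaround (a sketch under assumptions, not an official fix from the maintainers) is to build `target_modules` only from modules that are exactly `nn.Linear`, so PEFT never replaces PLoRA subclasses:

```python
import torch.nn as nn
from peft import LoraConfig, get_peft_model

# Collect leaf names of modules that are exactly nn.Linear; the exact-type
# check deliberately skips subclasses such as PLoRA (this assumes PLoRA
# subclasses nn.Linear in the modeling code).
plain_linear_names = sorted({
    name.split(".")[-1]
    for name, module in model.named_modules()
    if type(module) is nn.Linear
})

lora_config = LoraConfig(
    r=lora_args.lora_r,
    lora_alpha=lora_args.lora_alpha,
    target_modules=plain_linear_names,
    lora_dropout=lora_args.lora_dropout,
    bias=lora_args.lora_bias,
    task_type='CAUSAL_LM',
)
model = get_peft_model(model, lora_config)
# Caveat: PEFT matches target_modules by name suffix, so this only helps
# if no PLoRA layer shares a leaf name with a plain nn.Linear layer.
```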