Closed whyiug closed 1 month ago
tune_llm=true
means full-parameter training of the base LLM, i.e. updating its original weights directly.
You should understand what LoRA does: it avoids full-parameter training by freezing the base weights and training new low-rank side branches instead. So it makes no sense to train the full parameters and train LoRA side branches at the same time.
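To make the "side branches" point concrete, here is a minimal sketch of a LoRA-wrapped linear layer in plain PyTorch. This is not the repo's implementation; `LoRALinear`, `r`, and `alpha` are illustrative, and the point is only that the frozen base weights and the small trainable branches are disjoint parameter sets.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen Linear layer plus a trainable low-rank side branch (LoRA sketch)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # base LLM weights stay frozen under LoRA
        # low-rank factors A (r x in) and B (out x r); B starts at zero so the
        # branch initially contributes nothing
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x):
        # frozen path + scaled low-rank side branch
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(nn.Linear(512, 512))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(trainable, total)  # only the A/B branches are trainable
```

Training the base weights as well (what `tune_llm=true` requests) would unfreeze `self.base`, at which point the low-rank branch no longer saves anything; that is the contradiction the error guards against.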
@980044579
Of course I understand what LoRA does. The param here (tune_llm
) is ambiguous, because LoRA is also a way of fine-tuning the LLM.
Hi, when I set
tune_llm=true
, it reports an error: "The model cannot simultaneously adjust LLM parameters and apply LoRA."
I wonder why we can't fine-tune the llm model using lora.
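For what it's worth, the error quoted above implies a mutual-exclusion check along these lines. This is a hypothetical sketch, not the repo's actual code; the flag names `tune_llm` and `use_lora` mirror the discussion, and the resolution it suggests is that LoRA fine-tuning of the LLM is selected by the LoRA flag alone, with `tune_llm` left false.

```python
def validate_training_config(tune_llm: bool, use_lora: bool) -> None:
    """Hypothetical check: full-parameter LLM training and LoRA are exclusive."""
    if tune_llm and use_lora:
        raise ValueError(
            "The model cannot simultaneously adjust LLM parameters and apply LoRA."
        )

# LoRA fine-tuning of the LLM: leave tune_llm off and enable LoRA
validate_training_config(tune_llm=False, use_lora=True)   # ok
# Full-parameter fine-tuning: enable tune_llm without LoRA
validate_training_config(tune_llm=True, use_lora=False)   # ok
```

Under this reading, `tune_llm=true` specifically means "update the base LLM weights", so LoRA-based fine-tuning is not expressed through it.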