Closed guxm2021 closed 10 months ago
Hello, I have met a similar problem. Do you know whether --mm_projector_lr
takes effect when --learning_rate
is also set? I'm confused about the priority between --mm_projector_lr
and --learning_rate 2e-4.
@guxm2021
```python
# In llava_trainer.py, the optimizer is built from the grouped parameters:
optimizer_cls, optimizer_kwargs = Trainer.get_optimizer_cls_and_kwargs(self.args)
self.optimizer = optimizer_cls(optimizer_grouped_parameters, **optimizer_kwargs)
```
@haotian-liu
I just cannot figure out whether the optimizer_grouped_parameters
built in llava_trainer.py
still take effect when --learning_rate
is set.
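Both settings can coexist: in PyTorch, a per-parameter-group "lr" entry overrides the optimizer-level default, which is the mechanism the projector learning rate relies on. Below is a minimal, hypothetical sketch of that priority rule (a simplified stand-in, not the actual llava_trainer.py code):

```python
# Hypothetical sketch: how a per-group "lr" overrides the optimizer-level
# default lr, mirroring torch.optim.Optimizer's behavior. Here
# default_lr plays the role of --learning_rate and the explicit "lr"
# on the projector group plays the role of --mm_projector_lr.

def resolve_group_lrs(param_groups, default_lr):
    """Each group without an explicit "lr" falls back to the default."""
    return [{**group, "lr": group.get("lr", default_lr)} for group in param_groups]

groups = [
    # No "lr" key: this group uses --learning_rate.
    {"name": "backbone", "params": ["model.layers.0.weight"]},
    # Explicit "lr": this group uses --mm_projector_lr instead.
    {"name": "mm_projector", "params": ["mm_projector.weight"], "lr": 2e-5},
]

resolved = resolve_group_lrs(groups, default_lr=2e-4)
print([(g["name"], g["lr"]) for g in resolved])
# → [('backbone', 0.0002), ('mm_projector', 2e-05)]
```

So setting --learning_rate does not disable the projector group; the projector parameters simply keep their own rate while everything else uses the default.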
Describe the issue
When I fine-tune the llava-1.5-7b using
scripts/v1_5/finetune_lora.sh
with minimal changes, here is my command:
Command:
I encountered the following value error:
Log:
When I remove the argument
--mm_projector_lr 2e-5
from the command, training runs. But I don't know whether I would still get the correct model.