X-PLUG / mPLUG-Owl

mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
https://www.modelscope.cn/studios/damo/mPLUG-Owl
MIT License

Ablation on LORA #28

Closed · jihan-yin closed this issue 1 year ago

jihan-yin commented 1 year ago

Have you guys tried fine-tuning the whole LLaMA decoder during the fine-tuning stage instead of using LoRA? Curious what findings or insights you might have there, since I didn't see it included in the paper.

MAGAer13 commented 1 year ago

We haven't done that yet. You can use the script train_it_wo_lora.sh to fine-tune the whole LLM. Currently, we are working hard on the next version; once it's done, we will release the checkpoint. Stay tuned.
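For context on what the ablation compares: full fine-tuning updates every decoder weight, while LoRA freezes the pretrained weights and learns a low-rank additive delta. A minimal numpy sketch of the idea (this is an illustrative toy, not the repo's actual training code; all variable names are made up):

```python
import numpy as np

# Toy LoRA vs. full fine-tuning on a single linear layer.
# Full fine-tuning: all of W (d_out * d_in params) is trainable.
# LoRA: W is frozen; only A (r x d_in) and B (d_out x r) are trained,
# with rank r << min(d_out, d_in), so far fewer trainable params.
d_out, d_in, r = 8, 8, 2
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))                   # trainable, zero init so delta starts at 0

x = rng.standard_normal(d_in)
y_lora = W @ x + B @ (A @ x)               # adapter path adds a low-rank update

full_trainable = W.size                    # 64 params if fine-tuning W directly
lora_trainable = A.size + B.size           # 32 params (r * (d_in + d_out))
print(full_trainable, lora_trainable)
```

Because B starts at zero, the LoRA model initially matches the frozen pretrained layer exactly; training only A and B then nudges it away at a fraction of the parameter cost, which is the trade-off the question is asking to be ablated.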

MAGAer13 commented 1 year ago

Hi, we have released the fine-tuned checkpoint for your reference!