X-PLUG / mPLUG-Owl

mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
https://www.modelscope.cn/studios/damo/mPLUG-Owl
MIT License

If I want to finetune based on your checkpoint with LoRA, which one should I use: [mplug-owl-llama-7b-ft] or [mplug-owl-llama-7b]? #144

Open · hangzeli08 opened this issue 1 year ago

hangzeli08 commented 1 year ago

What are the detailed differences between Instruction tuning (LoRA) and Instruction tuning (FT)? If I want to finetune based on your checkpoint with LoRA, which one should I use: [mplug-owl-llama-7b-ft] or [mplug-owl-llama-7b]?

hangzeli08 commented 1 year ago

If I want to continue finetuning on top of your finetuned checkpoint using LoRA, which checkpoint should I use: mplug-owl-llama-7b-ft or mplug-owl-llama-7b?
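
For concreteness, here is roughly what I am planning to do, using the Hugging Face PEFT library. The checkpoint path, the loading class, and the `target_modules` below are just my guesses (mPLUG-Owl has its own model class, so plain `AutoModelForCausalLM` may not apply) — this is only a sketch of the LoRA setup, not the repo's official training script:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Assumed checkpoint path -- this is exactly what I am unsure about:
# should it be mplug-owl-llama-7b or mplug-owl-llama-7b-ft?
model = AutoModelForCausalLM.from_pretrained(
    "MAGAer13/mplug-owl-llama-7b",
    torch_dtype=torch.float16,
)

lora_config = LoraConfig(
    r=8,                                  # LoRA rank
    lora_alpha=32,                        # scaling factor
    target_modules=["q_proj", "v_proj"],  # guess: LLaMA attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# Wrap the base model so only the adapter weights are trainable
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```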

ambroser53 commented 1 year ago

I would also like to know this!

rohan598 commented 1 year ago

Yes, me too! Can you please help with this, @MAGAer13 @LukeForeverYoung?

zhyhome commented 6 months ago

I would also like to know this!