Open hangzeli08 opened 1 year ago
If I want to continue fine-tuning from your fine-tuned checkpoint using LoRA, which checkpoint should I use: mplug-owl-llama-7b-ft or mplug-owl-llama-7b?
I would also like to know this!
Yes, me too! Can you please help with this? @MAGAer13 @LukeForeverYoung
What are the detailed differences between Instruction tuning (LoRA) and Instruction tuning (FT)? If I want to fine-tune based on your checkpoint with LoRA, which one should I use: [mplug-owl-llama-7b-ft] or [mplug-owl-llama-7b]?
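For context on the LoRA-vs-FT distinction being asked about: full fine-tuning (FT) updates all model weights, while LoRA freezes the pretrained weights and trains only small low-rank update matrices on top of them. A minimal PyTorch sketch of the idea (this is an illustrative `LoRALinear` wrapper I wrote for explanation, not the actual mPLUG-Owl implementation):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update: W x + (B A) x."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights
        # Low-rank factors: A is small random init, B is zero init,
        # so the layer initially behaves exactly like the base layer.
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x):
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling

layer = LoRALinear(nn.Linear(64, 64))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(trainable, total)  # only the two small LoRA matrices are trainable
```

So the practical difference for the question above: an FT checkpoint already has instruction tuning baked into the full weights, while a LoRA run only needs a base checkpoint to attach its adapters to. Which of the two released checkpoints to start from still needs confirmation from the maintainers.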