X-PLUG / mPLUG-Owl

mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
https://www.modelscope.cn/studios/damo/mPLUG-Owl
MIT License

lora finetune multilingual version #131

Closed BaoyanWang closed 1 year ago

BaoyanWang commented 1 year ago

Hello, which language model is used in the released multilingual version? The checkpoint I downloaded is mplug-owl-bloomz-7b-multilingual. Does the LoRA fine-tuning code need to be modified accordingly?

MAGAer13 commented 1 year ago

It is bloomz-7b. You should replace the LoRA target-module keys (q_proj and v_proj) with query_key_value.
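The reason for the change: LLaMA-style attention has separate q_proj/k_proj/v_proj linear layers, while BLOOM(Z) fuses them into a single layer named query_key_value, so a LoRA target pattern written for q_proj/v_proj matches nothing on a BLOOMZ backbone. A minimal sketch of the pattern change, assuming regex-style target modules and hypothetical module paths (the exact layer names in mPLUG-Owl's code may differ):

```python
import re

# LoRA target patterns: the LLaMA-style one targets the separate q/v
# projections; the BLOOMZ one targets the fused query_key_value layer.
# Both patterns and the module paths below are illustrative assumptions.
LLAMA_PATTERN = r".*language_model.*\.(q_proj|v_proj)"
BLOOMZ_PATTERN = r".*language_model.*\.query_key_value"

def lora_targets(pattern, module_names):
    """Return the module names a LoRA target regex would select."""
    return [n for n in module_names if re.fullmatch(pattern, n)]

# Hypothetical module names as they might appear in each backbone.
llama_modules = [
    "language_model.layers.0.self_attn.q_proj",
    "language_model.layers.0.self_attn.k_proj",
    "language_model.layers.0.self_attn.v_proj",
]
bloomz_modules = [
    "language_model.h.0.self_attention.query_key_value",
    "language_model.h.0.self_attention.dense",
]

print(lora_targets(LLAMA_PATTERN, llama_modules))
# LLaMA pattern finds nothing on a BLOOMZ backbone -> no LoRA layers injected:
print(lora_targets(LLAMA_PATTERN, bloomz_modules))
print(lora_targets(BLOOMZ_PATTERN, bloomz_modules))
```

With PEFT's LoraConfig, the same fix amounts to setting `target_modules=["query_key_value"]` (or an equivalent regex) instead of `["q_proj", "v_proj"]` before wrapping the model.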