X-PLUG / mPLUG-Owl

mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
https://www.modelscope.cn/studios/damo/mPLUG-Owl
MIT License

Possibility of pre-training using a quantized LLM #142

Open pUmpKin-Co opened 1 year ago

pUmpKin-Co commented 1 year ago

Hi~ Thanks for your nice work. I have a question about the first pre-training stage: is it possible to align the vision part using a quantized LLM (8-bit or similar)? Or will quantization interfere with the alignment?
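For context on why this might work, here is a minimal NumPy sketch of symmetric per-tensor int8 quantization (an illustration only, not mPLUG-Owl's or bitsandbytes' actual scheme, which uses per-channel scales and fp16 outlier handling). It shows that the reconstruction error of an 8-bit weight matrix is small, which is one reason a frozen, quantized LLM backbone can still provide a usable target for aligning a vision module:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor int8 quantization; returns (int8 codes, scale)."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 codes back to float32 using the stored scale."""
    return q.astype(np.float32) * scale

# Toy stand-in for one frozen LLM weight matrix (sizes are arbitrary).
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=(256, 256)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# Relative error is typically on the order of 1% for well-behaved weights.
rel_err = np.linalg.norm(w - w_hat) / np.linalg.norm(w)
print(f"relative reconstruction error: {rel_err:.4f}")
```

The caveat is that real LLM weights have outlier channels; schemes like LLM.int8() keep those in higher precision precisely because a single per-tensor scale (as above) would wash out small weights. Since the LLM is frozen during the first stage, the main question is whether these small activation perturbations shift the alignment target, not whether gradients degrade.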