OpenGVLab / InternVL

[CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. An open-source multimodal dialogue model approaching GPT-4o performance.
https://internvl.readthedocs.io/en/latest/
MIT License

[Docs] A question of Mini-InternVL #660

Open xuliu-cyber opened 3 weeks ago

xuliu-cyber commented 3 weeks ago

📚 The doc issue

Hello, I read the paper "Mini-InternVL: A Flexible-Transfer Pocket Multimodal Model with 5% Parameters and 90% Performance". I was wondering: during the domain adaptation phase, do you fine-tune the three models separately for each domain? Thank you!

Suggest a potential alternative/fix

No response

G-z-w commented 2 weeks ago

Yes, we fine-tune the models separately for each domain.
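
For illustration, the setup described above can be sketched as follows: each domain gets its own fine-tuning run that starts from the same generalist checkpoint, producing one specialist model per domain rather than a single jointly tuned model. This is a minimal conceptual sketch, not the actual InternVL training code; the `finetune` helper, the output paths, and the domain list are assumptions for illustration (the three domains follow the paper's adaptation scenarios), and the checkpoint name is the public Mini-InternVL-Chat-2B-V1-5 release.

```python
# Conceptual sketch of per-domain adaptation for Mini-InternVL.
# The `finetune` function is a placeholder, not part of the InternVL codebase.

DOMAINS = ["autonomous_driving", "medical_images", "remote_sensing"]
BASE_CHECKPOINT = "OpenGVLab/Mini-InternVL-Chat-2B-V1-5"  # shared generalist starting point


def finetune(base_checkpoint: str, domain: str) -> str:
    """Placeholder for one domain-specific fine-tuning run.

    In practice this would launch the usual SFT pipeline on that domain's
    instruction data and return the path of the resulting checkpoint.
    """
    output_dir = f"checkpoints/mini_internvl_{domain}"
    # ... launch training on the domain's data here ...
    return output_dir


if __name__ == "__main__":
    # Each domain is adapted independently from the same base weights,
    # so the result is one fine-tuned model per domain.
    adapted = {domain: finetune(BASE_CHECKPOINT, domain) for domain in DOMAINS}
    for domain, ckpt in adapted.items():
        print(f"{domain}: {ckpt}")
```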