PKU-YuanGroup / MoE-LLaVA

Mixture-of-Experts for Large Vision-Language Models
https://arxiv.org/abs/2401.15947
Apache License 2.0

[Question] Train and eval scripts for the non-MoE LLaVA-Phi in Table 7 of the paper #92

Open sharkdrop opened 1 month ago

sharkdrop commented 1 month ago

Question

Hello authors, using the stage 2 checkpoints you provided, I have successfully reproduced the stage 3 training of MoE-LLaVA-Phi-2.7B-4e. I would now like to reproduce the stage 3 training of the non-MoE LLaVA-Phi reported in Table 7. Could you share the train scripts and eval scripts? Thank you very much!
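
In case it helps clarify what I am asking for, below is my current guess at the dense (non-MoE) stage-3 launch command, adapted from the existing phi2 finetune script with the MoE-specific flags left out. The script path, checkpoint/data paths, and flag values here are my assumptions, not the configuration actually used for Table 7.

```bash
#!/bin/bash
# Rough sketch of a dense (non-MoE) LLaVA-Phi stage-3 run.
# All paths and hyperparameters below are my guesses, adapted from the
# repo's phi2 finetune script; please correct anything that differs from
# the actual Table 7 setup.

deepspeed moellava/train/train_mem.py \
    --deepspeed ./scripts/zero2.json \
    --model_name_or_path ./checkpoints/llavaphi-2.7b-finetune \
    --version phi \
    --data_path ./data/llava_image_tune.json \
    --image_folder ./data \
    --vision_tower openai/clip-vit-large-patch14-336 \
    --mm_projector_type mlp2x_gelu \
    --mm_vision_select_layer -2 \
    --image_aspect_ratio pad \
    --bf16 True \
    --output_dir ./checkpoints/llavaphi-2.7b-finetune-dense \
    --num_train_epochs 1 \
    --per_device_train_batch_size 8 \
    --gradient_accumulation_steps 2 \
    --learning_rate 2e-5 \
    --model_max_length 2048 \
    --gradient_checkpointing True \
    --report_to tensorboard
    # Intentionally no --moe_enable / --num_experts / --top_k_experts here,
    # since the Table 7 baseline is the dense LLaVA-Phi.
```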