X-PLUG / mPLUG-Owl

mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
https://www.modelscope.cn/studios/damo/mPLUG-Owl
MIT License

Total GPU hours for training. #89

Closed: praeclarumjj3 closed this issue 1 year ago

praeclarumjj3 commented 1 year ago

Hi, thanks for the great work!

You mentioned the GPU requirements in #29 as 32 A100 GPUs for the first stage and a single A100 for the second stage. Could you also provide information about the training duration?

Thanks.

GallonDeng commented 1 year ago

same

MAGAer13 commented 1 year ago

About 7 days.
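For reference, the figures in this thread can be turned into a rough GPU-hour total. A minimal back-of-envelope sketch, assuming the "about 7 days" refers to the 32-GPU first stage (the reply does not say which stage, or whether it covers both):

```python
# Back-of-envelope GPU-hour estimate from the numbers in this thread.
# Assumption (not confirmed by the maintainers): "about 7 days" is the
# duration of the 32x A100 first stage mentioned in #29.

def gpu_hours(num_gpus: int, days: float) -> float:
    """Total GPU-hours for a job running on num_gpus for the given number of days."""
    return num_gpus * days * 24

stage1 = gpu_hours(num_gpus=32, days=7)  # 32 A100s for ~7 days
print(f"Stage 1: ~{stage1:.0f} A100 GPU-hours")  # ~5376 GPU-hours
```

Under that assumption, stage 1 alone comes to roughly 5,376 A100 GPU-hours; the single-GPU second stage would add comparatively little on top.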