ByungKwanLee / MoAI

[ECCV 2024] Official PyTorch implementation code for realizing the technical part of Mixture of All Intelligence (MoAI) to improve performance on numerous zero-shot vision-language tasks.

What hardware configuration is required for training? #5

Open TongkunGuan opened 6 months ago

TongkunGuan commented 6 months ago

Great work!

I would like to ask about the specific details of the training process: the type of GPU used (e.g., 3090, V100), the number of GPUs, and the duration of training in days. Could you provide this information?

Thanks!

ByungKwanLee commented 6 months ago

Thanks for your interest in our work!

Each training stage takes two to three days on 6x A6000 GPUs with a batch size of one. The reason we use a batch size of one is that gathering the image and language parts across a batch has a technical issue. The inference code still has this issue as well, so in theory the code should be run with a batch size of one.
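For reference, here is a minimal sketch of what the batch-size-one constraint looks like in PyTorch. The dataset class, tensor shapes, and vocabulary size below are hypothetical placeholders, not taken from the MoAI repo:

```python
import torch
from torch.utils.data import DataLoader, Dataset

class DummyVLDataset(Dataset):
    """Hypothetical stand-in for an image-text dataset."""
    def __init__(self, n=8):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        image = torch.randn(3, 224, 224)            # fake image tensor
        input_ids = torch.randint(0, 32000, (64,))  # fake token ids
        return {"image": image, "input_ids": input_ids}

# batch_size=1 mirrors the constraint described above: the image and
# language parts are gathered per sample, so larger batches would hit
# the gathering issue mentioned in the reply.
loader = DataLoader(DummyVLDataset(), batch_size=1, shuffle=True)

for batch in loader:
    assert batch["image"].shape[0] == 1  # one sample per step
```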