tobran / GALIP

[CVPR2023] A faster, smaller, and better text-to-image model for large-scale training
MIT License

How many epochs does GALIP need when trained on COCO? #16

Closed Di-Zayn closed 10 months ago

Di-Zayn commented 10 months ago

Hello, I decided to train GALIP on my own GPUs, and I found that the default epoch number in the setup file is 3005. Do I really need that many epochs to reach performance similar to the results in the paper? Thank you.

tobran commented 10 months ago

Hi, thanks for your interest in our work. We trained GALIP on COCO for about 1000 epochs (8 GPUs, batch size = 64). The number of training epochs needed will vary depending on the number of GPUs and the batch size.
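One common heuristic for translating that guidance to a different hardware setup is to keep the total number of optimizer steps roughly constant: if your effective batch size (GPUs × per-GPU batch) differs from the reference run, scale the epoch count proportionally. This is a hedged sketch, not code from the GALIP repo, and it assumes the quoted batch size of 64 is per GPU:

```python
# Heuristic sketch (not from the GALIP repo): keep the total number of
# gradient updates roughly constant when the effective batch size changes.

def scaled_epochs(ref_epochs: int, ref_gpus: int, ref_batch: int,
                  gpus: int, batch: int) -> int:
    """Scale the epoch count so total optimizer steps stay comparable.

    steps = epochs * dataset_size / effective_batch, so holding steps
    constant gives: epochs_new = ref_epochs * effective_new / effective_ref.
    """
    ref_effective = ref_gpus * ref_batch  # reference effective batch size
    effective = gpus * batch              # your effective batch size
    return round(ref_epochs * effective / ref_effective)

# Reference run from the reply above: ~1000 epochs on 8 GPUs, batch size 64.
print(scaled_epochs(1000, 8, 64, 1, 64))  # → 125 epochs on a single GPU
```

This is only a first-order estimate; in practice GAN training stability also depends on the learning rate and the batch size itself, so some tuning around this number should be expected.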