PRBonn / segcontrast

MIT License
96 stars 13 forks

About the time cost of pre-training #17

Closed JUNJIE99 closed 1 year ago

JUNJIE99 commented 1 year ago

Thanks for your great work! I am trying to reproduce the paper with two GeForce RTX 3090 GPUs, but I found that training was expected to take 25 days to complete. May I ask how long one pre-training run took you with four NVIDIA GTX 1080 Ti 12 GB GPUs? I want to know whether something is wrong with my experiment. Thank you!

nuneslu commented 1 year ago

Hi! In our setup we used four GPUs with 12 GB each and it took about one week. So I guess that with one GPU, 25 days is around the correct time. But I also noticed that with only 50 epochs (instead of the 200 we pre-trained for in the paper), the pre-training was already achieving results close to the full pre-training.
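For a rough sanity check, the numbers above can be plugged into a back-of-envelope estimate. This is only a sketch that assumes pre-training time scales linearly with the number of epochs and inversely with the number of GPUs (real multi-GPU scaling is usually somewhat worse due to communication overhead, and GPU models differ in speed); the `estimate_days` helper and its reference values are taken from the reply above, not from the segcontrast codebase:

```python
def estimate_days(epochs, n_gpus, ref_days=7.0, ref_epochs=200, ref_gpus=4):
    """Scale the reference run reported above (4 GPUs, 200 epochs, ~7 days),
    assuming ideal linear scaling in both epochs and GPU count."""
    # Cost of one epoch on one GPU, in GPU-days: 7 * 4 / 200 = 0.14
    gpu_days_per_epoch = ref_days * ref_gpus / ref_epochs
    return gpu_days_per_epoch * epochs / n_gpus

print(estimate_days(200, 2))  # two GPUs, full 200-epoch schedule -> 14.0 days
print(estimate_days(50, 2))   # two GPUs, shortened 50-epoch schedule -> 3.5 days
print(estimate_days(200, 1))  # single GPU, 200 epochs -> 28.0 days
```

Under this idealized model, a single GPU lands near the 25-day figure, while two GPUs with the shorter 50-epoch schedule would cut the run to a few days.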

JUNJIE99 commented 1 year ago

I see! Thanks for your reply.