layer6ai-labs / xpool

https://layer6ai-labs.github.io/xpool/

Question about setting #3

Closed JonnyS1226 closed 2 years ago

JonnyS1226 commented 2 years ago

Hello, thank you for your very nice work! I have a question about the experimental setting: what kind of GPU, and how many, did you use in your experiments? Thanks!

NoelVouitsis commented 2 years ago

Hi there, thank you for taking interest in our work! We used a single NVIDIA Titan RTX GPU with 24GB of memory for our experiments. Thank you!

1024er commented 2 years ago

How long does each experiment take on a single GPU? @NoelVouitsis

NoelVouitsis commented 2 years ago

It depends on the dataset. For MSR-VTT it took around 8 hours. Thanks!

jianghaojun commented 2 years ago

@NoelVouitsis I tried to reproduce the results on MSR-VTT-9K, and it took 10 hours to run one epoch on a single RTX 3090. Do you know of any factors that might explain this?

NoelVouitsis commented 2 years ago

You could experiment with batch_size, evals_per_epoch, and num_workers. Also consider where the MSR-VTT dataset is stored: if it lives on a different disk, I/O can become a bottleneck. Thanks!
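For concreteness, those three knobs could be exposed as command-line flags along these lines. This is a hypothetical sketch, not the repo's actual argument parser; the real flag names and defaults in xpool's config may differ.

```python
import argparse

# Hypothetical sketch of the tuning flags discussed above;
# check the repo's own config / argument parser for the exact names.
parser = argparse.ArgumentParser(description="xpool training (sketch)")
parser.add_argument("--batch_size", type=int, default=32,
                    help="larger batches better utilize GPU memory")
parser.add_argument("--num_workers", type=int, default=8,
                    help="parallel CPU workers for video decoding/loading")
parser.add_argument("--evals_per_epoch", type=int, default=1,
                    help="fewer mid-epoch evaluations shortens wall-clock time")

# Example invocation with explicit values for two of the flags.
args = parser.parse_args(["--batch_size", "32", "--num_workers", "8"])
print(args.batch_size, args.num_workers, args.evals_per_epoch)
```

Raising num_workers mainly helps when video decoding on the CPU, rather than the GPU forward/backward pass, dominates each step.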

NoelVouitsis commented 2 years ago

Also, compressing the videos to 3 fps using the code in preprocess/compress_video.py can significantly speed up training.

jianghaojun commented 2 years ago

Thanks for your suggestions.