Saunak626 opened this issue 2 weeks ago
Yes, we use two 24GB GPUs during training due to resource limits. However, training does not take much time because we initialize with pre-trained weights trained on the Kinetics dataset.
Hello! I noticed you mentioned training on two 24GB GPUs due to resource limits, initializing from weights pre-trained on the Kinetics dataset. I am reaching out to ask about the approximate time per epoch when training on the EPIC dataset under similar settings. With my setup of two 24GB GPUs (NVIDIA RTX 3090), each epoch takes approximately 3 hours to complete. Could you kindly share how long each epoch typically took for you?
The training time is approximately half a day for the pretraining stage in the Ego4D-EPIC scenario and about 1.5 days for the multimodal distillation stage. We used two RTX 4090 GPUs for both stages.
From `config_pretrain.yaml`:

```yaml
accelerator: gpu
devices: [0, 1]
```
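For context, these keys look like PyTorch Lightning Trainer options, where `devices: [0, 1]` selects GPU indices 0 and 1. A fuller two-GPU sketch might look like the following; note that the `strategy` and `precision` keys are assumptions for illustration, not taken from the repo's actual config:

```yaml
# Hedged sketch of a two-GPU Lightning-style config.
# Only accelerator/devices appear in the repo; the rest is assumed.
accelerator: gpu
devices: [0, 1]        # GPU indices 0 and 1 (i.e., two cards)
strategy: ddp          # assumed: distributed data parallel across the two GPUs
precision: 16-mixed    # assumed: mixed precision to fit within 24GB per card
```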
Hello! Does this mean only two graphics cards are used? How much video memory is needed per card?