Caixy1113 opened 3 days ago
Hello! Thank you for your excellent work. I would like to ask about the GPU resources needed to train CLIPort. When the batch size is set to 1, approximately how much GPU memory is used? Also, how long does it take to train for 600k steps? Thank you!

Hi, thanks for your interest in our work. We don't recommend a batch size of 1, since training would be very slow. In our case, we used 2 A6000 GPUs (48 GB of VRAM each) with a batch size of 14, and it took around 12 days to train the primitive.
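If you want a concrete number for your own setup, one quick way is to run a single training step at batch size 1 and read PyTorch's peak-memory counter. A minimal sketch (the model and input shape below are placeholders, not the actual CLIPort agent):

```python
import torch
import torch.nn as nn

# Placeholder model; substitute the actual CLIPort agent here.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 1, 3, padding=1),
).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

torch.cuda.reset_peak_memory_stats()

# One forward/backward pass at batch size 1 to probe peak VRAM.
x = torch.randn(1, 3, 320, 160, device="cuda")  # placeholder input shape
loss = model(x).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()

print(f"Peak VRAM at batch size 1: "
      f"{torch.cuda.max_memory_allocated() / 1024**3:.2f} GiB")
```

Note this only measures memory allocated through PyTorch's caching allocator; `nvidia-smi` will typically report somewhat more.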