Shengqiang-Zhang / LoHo-Ravens

Official code for the long-horizon language-conditioned robotic manipulation benchmark LoHoRavens.
Apache License 2.0

Inquiry About GPU of training #4

Open Caixy1113 opened 3 days ago

Caixy1113 commented 3 days ago

Hello! Thank you for your excellent work. I would like to ask about the GPU resources needed to train CLIPort. With a batch size of 1, approximately how much GPU memory is used? Also, how long does it take to train for 600k steps? Thank you!

Shengqiang-Zhang commented 2 days ago

Hi, thanks for your interest in our work. We don't recommend a batch size of 1, since training would be very slow. In our case, we used 2 A6000 GPUs (48 GB VRAM each) with a batch size of 14, and training the primitive took around 12 days.
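For planning your own runs, a rough back-of-the-envelope throughput can be derived from the figures in this reply (600k steps in about 12 days on 2 A6000s with batch size 14). This is only an estimate from those numbers, not a measured benchmark:

```python
# Rough training throughput implied by the reply above:
# ~600,000 steps completed in ~12 days.
TOTAL_STEPS = 600_000
DAYS = 12
SECONDS_PER_DAY = 86_400

steps_per_day = TOTAL_STEPS / DAYS                 # steps completed per day
steps_per_sec = steps_per_day / SECONDS_PER_DAY    # average optimizer steps per second

print(f"{steps_per_day:.0f} steps/day, {steps_per_sec:.2f} steps/sec")
```

Scaling from this, halving the batch size to 7 would roughly double the number of steps needed for the same number of samples seen, so a much smaller batch (or a single smaller GPU) could stretch the wall-clock time well beyond 12 days.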