Haiyang-W / DSVT

[CVPR2023] Official Implementation of "DSVT: Dynamic Sparse Voxel Transformer with Rotated Sets"
https://arxiv.org/abs/2301.06051
Apache License 2.0

GPU memory size #23

Closed skprot closed 1 year ago

skprot commented 1 year ago

Thanks for your incredible work!

  1. What is the minimum amount of GPU memory my setup needs to run training?
  2. I also could not find in the paper how much GPU memory is needed for model inference. Could you share that?

Haiyang-W commented 1 year ago

Hi,

  1. If you use DSVT-P with batch size = 1 and fp16, training requires only about 9 GB.
  2. All inference variants run in fp32 and require about 3-4 GB of GPU memory.
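
For reference, here is a minimal sketch (not the repo's actual training script, which configures this through OpenPCDet) of how fp16 training with PyTorch AMP keeps memory down, plus a peak-memory readout you can use to check your own setup. `model`, `loader`, and `optimizer` are placeholders for your detector, dataloader, and optimizer.

```python
import torch

# Hypothetical fp16 training step using PyTorch AMP; placeholders only.
scaler = torch.cuda.amp.GradScaler()

for batch in loader:
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():      # forward pass runs in mixed precision
        loss = model(batch)
    scaler.scale(loss).backward()        # scaled backward to avoid fp16 underflow
    scaler.step(optimizer)
    scaler.update()

# Report the peak GPU memory allocated so far on the current device.
peak_gb = torch.cuda.max_memory_allocated() / 1024 ** 3
print(f"Peak GPU memory: {peak_gb:.2f} GB")
```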