ingra14m / Deformable-3D-Gaussians

[CVPR 2024] Official implementation of "Deformable 3D Gaussians for High-Fidelity Monocular Dynamic Scene Reconstruction"
https://ingra14m.github.io/Deformable-Gaussians/
MIT License

Possible memory leak #16

Closed: pablodawson closed this issue 10 months ago

pablodawson commented 10 months ago

Hey,

Even after your latest commit, I see GPU memory usage going up as training progresses, and training getting slower:

On a 4090 GPU (24 GB VRAM), for example:

  - Iter 5000: 15% GPU memory usage, ETA 36:28
  - Iter 14000: 65% GPU memory usage, ETA 1:20:10

Is this normal? Maybe there's another missing detach() somewhere?

Thanks!

ingra14m commented 10 months ago

Hi, this is actually quite normal.

The VRAM overhead mainly consists of several parts:

  1. the input image (which is loaded onto the GPU by default unless --load2gpu_on_the_fly is used),
  2. the MLP query,
  3. the rasterization of the 3D Gaussians,
  4. the model parameters of the 3D Gaussians (XYZ, rotation, scaling, opacity, etc.).
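
The footprint of item 4 scales linearly with the number of Gaussians, which is why densification drives VRAM up. A minimal back-of-the-envelope sketch, assuming float32 parameters and spherical-harmonic color coefficients (the exact attribute layout here is an assumption; the repo's defaults may differ):

```python
# Rough estimate of the learnable-parameter VRAM for N 3D Gaussians
# (item 4 above). Assumes float32 tensors and SH color coefficients;
# the actual tensor layout in the repo may differ.

def gaussian_param_bytes(num_gaussians: int, sh_degree: int = 3) -> int:
    """Approximate bytes of learnable parameters for num_gaussians."""
    floats_per_gaussian = (
        3                            # XYZ position
        + 4                          # rotation quaternion
        + 3                          # scaling
        + 1                          # opacity
        + 3 * (sh_degree + 1) ** 2   # SH color coefficients (RGB)
    )
    return num_gaussians * floats_per_gaussian * 4  # 4 bytes per float32

# E.g. one million Gaussians at SH degree 3:
mb = gaussian_param_bytes(1_000_000) / 2**20  # roughly a few hundred MB
```

Note this covers only the raw parameters; optimizer state (e.g. Adam's two moment buffers) and gradients multiply this by several times, and rasterization buffers add more on top.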

In my experiments, the VRAM overhead of the MLP query is not significant. The increase in VRAM usage is because, before iteration 15k, the 3D Gaussians in canonical space are continuously being densified.
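
The growth-then-plateau behavior described above can be sketched as a toy schedule. The constants below (growth rate, interval, initial count) are hypothetical illustrations, not the repo's actual hyperparameters:

```python
# Toy model of why VRAM climbs until the densification cutoff and then
# plateaus. All numbers here are illustrative assumptions, not the
# repo's actual schedule.

DENSIFY_UNTIL = 15_000   # iteration after which densification stops
DENSIFY_EVERY = 100      # assumed densification interval
GROWTH_FACTOR = 1.01     # assumed point-count growth per densification

def gaussian_count(iteration: int, initial: int = 100_000) -> int:
    """Gaussian count at a given iteration under the toy schedule."""
    steps = min(iteration, DENSIFY_UNTIL) // DENSIFY_EVERY
    return int(initial * GROWTH_FACTOR ** steps)

# The count (and hence parameter/optimizer VRAM) keeps growing up to
# iteration 15k, then stays flat for the rest of training.
```

Under this sketch, `gaussian_count(5_000)` is well below `gaussian_count(14_000)`, matching the reported jump in memory usage between those iterations, while counts at 15k and beyond are identical.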