kwea123 / ngp_pl

Instant-ngp in pytorch+cuda trained with pytorch-lightning (high quality with high speed, with only a few lines of legible code)

Downsample parameter doesn't reduce GPU memory consumption #101

Closed salykova closed 1 year ago

salykova commented 1 year ago

Hi @kwea123!

Thanks for your great work! I noticed that downsampling doesn't reduce GPU memory consumption during training. I tried downsample values from 0.1 to 1, but the memory footprint was always around 6.5 GB on the Lego dataset with default arguments. Is this expected, or am I missing something? I had expected that downsampling would reduce the memory footprint.
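
For reference, here is a minimal sketch of one way to compare the footprint between runs from the PyTorch side (the helper below is hypothetical and not part of ngp_pl; memory held directly by custom CUDA extensions may not be counted by this API):

```python
import torch

def report_peak_gpu_memory(tag: str = "") -> float:
    """Print and return the peak GPU memory (GB) allocated through PyTorch.

    Only tensors managed by PyTorch's caching allocator are counted, so buffers
    allocated directly by custom CUDA extensions may be missing from this figure.
    """
    peak_gb = torch.cuda.max_memory_allocated() / 1024 ** 3
    print(f"[mem] {tag or 'run'}: peak allocated {peak_gb:.2f} GB")
    return peak_gb

# Call once after a few training steps in each run, e.g. one run started with
# downsample=0.1 and one with downsample=1.0, then compare the two readings.
# report_peak_gpu_memory("downsample=0.1")
```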