Hi @kwea123!
Thanks for your great work! I noticed that downsampling doesn't reduce GPU memory consumption during training. I tried downsample values from 0.1 to 1, but the memory footprint stayed around 6.5 GB on the Lego dataset with the default arguments. Is this expected, or am I missing something? I had expected that downsampling would reduce the memory footprint.
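To frame my expectation: my working assumption (function names below are hypothetical, just for illustration, not from this repo) was that downsampling shrinks the total ray pool, but if the trainer samples a fixed batch of rays per step, the per-step GPU workload would stay the same regardless of the downsample factor:

```python
# Illustrative sketch of my assumption -- numbers and names are hypothetical.
# If each training step draws a fixed batch of rays, the GPU-side workload
# scales with (batch_size * samples_per_ray), not with image resolution.

def rays_in_dataset(width, height, num_images, downsample):
    """Total rays available for training after downsampling the images."""
    w = int(width * downsample)
    h = int(height * downsample)
    return w * h * num_images

def queries_per_step(batch_size, samples_per_ray):
    """Network queries per optimization step -- independent of downsample."""
    return batch_size * samples_per_ray

# Downsampling shrinks the ray pool (dataset size) a hundredfold...
full = rays_in_dataset(800, 800, 100, downsample=1.0)    # 64,000,000 rays
tenth = rays_in_dataset(800, 800, 100, downsample=0.1)   # 640,000 rays

# ...but the per-step workload the GPU holds is unchanged.
step = queries_per_step(batch_size=1024, samples_per_ray=64)  # 65,536 queries
```

If this is roughly what happens, it would explain the flat 6.5 GB — is that the case here, or should downsampling also affect per-step memory?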