Why
Dataset preparation required more than 12 GB of VRAM on the lego dataset when I tried to train a model with the preset nerf_hash config.
What
Ray tensors are generated as float64 tensors during dataset preparation in main, even though they are converted to float32 just a few lines below. This PR changes the dtype used for tensor generation to float32. This modification reduced VRAM usage with the preset config from roughly 14 GB to roughly 8 GB (exact numbers may vary).
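To illustrate the effect (not the repository's actual code), here is a minimal NumPy sketch of the pattern being fixed: generating a large ray grid at the library's default float64 and converting afterwards briefly holds a double-precision copy in memory, while passing dtype=float32 at generation time halves the footprint of that intermediate.

```python
import numpy as np

n = 1000  # hypothetical image resolution for the ray grid

# Before: generated as float64 (NumPy's default), converted a few lines later.
# The float64 intermediate exists at full size before the conversion.
rays_f64 = np.stack(
    np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n)), axis=-1
)
assert rays_f64.dtype == np.float64
rays_converted = rays_f64.astype(np.float32)

# After: generate directly in float32, so no double-precision copy is made.
rays_f32 = np.stack(
    np.meshgrid(
        np.linspace(0, 1, n, dtype=np.float32),
        np.linspace(0, 1, n, dtype=np.float32),
    ),
    axis=-1,
)

# The float64 intermediate occupies exactly twice the memory.
print(rays_f64.nbytes // rays_f32.nbytes)  # → 2
```

The same reasoning applies to GPU tensors: a framework that defaults to double precision for generated coordinates pays double the VRAM until the cast, so specifying the target dtype up front avoids the peak.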
P.S. The RTMV dataset may need a similar change, but this PR does not include it because I could not work with the dataset due to its size.