sxyu / svox2

Plenoxels: Radiance Fields without Neural Networks

Encounter memory issue when training my own dataset. #30

Closed · MarcoG5 closed this issue 2 years ago

MarcoG5 commented 2 years ago

First, I would like to thank you for your great work and for making it public! I noticed that all the ground-truth data is read into memory at once. I understand that this speeds up training, but when I use my own dataset with high-resolution images or a large number of frames, my system memory is not enough for training. Is there a way to avoid reading all the data at once, or is the algorithm designed to work this way?

pwais commented 2 years ago

You might be able to refactor `Rays` to lazy-load images in `__getitem__()`.

A quicker hack (that may work just as well) is to create a large swapfile, if you have an SSD: https://www.digitalocean.com/community/tutorials/how-to-add-swap-space-on-ubuntu-16-04. Some amount of `Rays` will still need to fit in GPU RAM, though.
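For reference, here is a minimal sketch of the lazy-loading idea, assuming a PyTorch-style `Dataset`. The `LazyImageDataset` class, the file paths, and the placeholder poses below are hypothetical illustrations, not the repo's actual `Rays` implementation:

```python
import glob
import torch
import imageio.v2 as imageio
from torch.utils.data import Dataset, DataLoader

class LazyImageDataset(Dataset):
    """Hypothetical dataset that keeps only file paths and camera poses
    in memory and decodes each ground-truth image on demand."""

    def __init__(self, image_paths, poses):
        self.image_paths = image_paths  # cheap: just a list of strings
        self.poses = poses              # small (N, 4, 4) tensor

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        # Decode from disk only when this sample is requested,
        # instead of preloading every frame at startup.
        img = imageio.imread(self.image_paths[idx])
        img = torch.from_numpy(img).float() / 255.0  # HWC, in [0, 1]
        return img, self.poses[idx]

# Example usage (paths and poses are placeholders):
image_paths = sorted(glob.glob("data/my_scene/images/*.png"))
poses = torch.eye(4).expand(len(image_paths), 4, 4)
loader = DataLoader(LazyImageDataset(image_paths, poses),
                    batch_size=1, num_workers=4, shuffle=True)
```

With `num_workers > 0`, the `DataLoader` decodes images in background worker processes, so only a few batches are resident in host RAM at any time, at the cost of disk I/O each epoch.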

MarcoG5 commented 2 years ago

> You might be able to refactor `Rays` to lazy-load images in `__getitem__()`.
>
> A quicker hack (that may work just as well) is to create a large swapfile, if you have an SSD: https://www.digitalocean.com/community/tutorials/how-to-add-swap-space-on-ubuntu-16-04. Some amount of `Rays` will still need to fit in GPU RAM, though.

Thank you for your quick answer, I will look into it!