karthik101200 opened this issue 5 months ago
Can you print the full error message and clarify: (1) how many images did you use, and what is the resolution? (2) how many points were there when it went OOM?
109 images. The resolution is the default (I think you have kept it at -1), and around 15 points at OOM.
The number of images is large because it is quite a big unbounded scene: a simulated hospital environment captured from a robot's POV. The mesh, although incomplete, looked quite promising after 4.5k iterations, so I wanted to train it further.
You can store the images on the CPU to reduce GPU memory consumption. See PR https://github.com/hbb1/2d-gaussian-splatting/pull/45.
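The gist is to keep the ground-truth images in host RAM and copy only the current view to the GPU each iteration, so the full image set never has to fit in VRAM. A minimal sketch of that pattern (class and attribute names here are illustrative, not the repo's exact API):

```python
import torch

class CpuBackedView:
    """Sketch: store the loaded image on the data device (CPU) instead of the GPU."""
    def __init__(self, image: torch.Tensor, data_device: str = "cpu"):
        # The full-resolution image tensor stays in host memory after loading.
        self.original_image = image.clamp(0.0, 1.0).to(data_device)
```

With all 109 images kept on the CPU this way, GPU memory is only needed for the Gaussians and the render buffers.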
So just to confirm, I need to pass --data_device cpu, right? Because when I do that I still get OOM, but with a larger requested allocation:

CUDA out of memory. Tried to allocate 72.07 GiB (GPU 0; 23.49 GiB total capacity; 159.48 MiB already allocated; 22.58 GiB free; 210.00 MiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
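As an aside, the max_split_size_mb hint in that message refers to the general PYTORCH_CUDA_ALLOC_CONF allocator setting; it only helps with fragmentation, so it probably will not rescue a single 72 GiB allocation, but for reference it can be set like this before CUDA is initialized (the 128 value is just an example):

```python
import os

# General PyTorch allocator knob, not specific to 2DGS; it must be set before
# the first CUDA allocation, hence before torch touches the GPU.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"  # 128 is an example value

import torch  # imported after setting the env var so the allocator sees it
```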
You need to follow the PR and make the necessary changes; they are very minor:
https://github.com/hbb1/2d-gaussian-splatting/pull/45/commits/906bf01347302cbdb70edb06047531d22ef78b16
https://github.com/hbb1/2d-gaussian-splatting/pull/45/commits/99bd1533ad2c77cf6e581be37aa62204ef7bd2b3
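Roughly, the change amounts to respecting the data device when images are loaded and moving only the current view's ground-truth image to the GPU inside the training loop, something along these lines (a sketch of the pattern, not the literal diff; see the commits above for the exact changes, and note that the original_image attribute here just mirrors the sketch earlier in this thread):

```python
import torch

def get_gt_on_gpu(view) -> torch.Tensor:
    # Copy just this view's ground-truth image to the GPU right before it is
    # needed for the loss; the CPU copy created at load time stays where it is.
    return view.original_image.cuda(non_blocking=True)

# Inside the training loop (pseudocode):
#   gt_image = get_gt_on_gpu(viewpoint_cam)
#   loss = l1_loss(rendered_image, gt_image)
```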
num_rendered, color, depth, radii, geomBuffer, binningBuffer, imgBuffer = _C.rasterize_gaussians(*args)
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 35.79 GiB (GPU 0; 23.49 GiB total capacity; 990.78 MiB already allocated; 21.66 GiB free; 1.14 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
35.79 GiB seems like a lot?
When I run on a more powerful mobile GPU (Ada A2000) but with less VRAM (8 GB), it starts training but goes out of memory after 4.5k iterations. Is there any way to solve either of these issues?