graphdeco-inria / gaussian-splatting

Original reference implementation of "3D Gaussian Splatting for Real-Time Radiance Field Rendering"
https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/

CUDA out of memory #890

Open cjw1005 opened 1 month ago

cjw1005 commented 1 month ago

    Optimizing data/truck/output
    Output folder: data/truck/output [16/07 12:09:31]
    Tensorboard not available: not logging progress [16/07 12:09:31]
    Reading camera 251/251 [16/07 12:09:43]
    Loading Training Cameras [16/07 12:09:43]
    Loading Test Cameras [16/07 12:09:53]
    Number of points at initialisation : 136029 [16/07 12:09:53]
    Training progress:   0%|          | 0/30000 [00:00<?, ?it/s]
    Traceback (most recent call last):
      File "train.py", line 226, in <module>
        training(lp.extract(args), op.extract(args), pp.extract(args), args.test_iterations, args.save_iterations, args.checkpoint_iterations, args.start_checkpoint, args.debug_from)
      File "train.py", line 123, in training
        gaussians.add_densification_stats(viewspace_point_tensor, visibility_filter)
      File "/home/jwcui/gaussian-splatting/scene/gaussian_model.py", line 406, in add_densification_stats
        self.xyz_gradient_accum[update_filter] += torch.norm(viewspace_point_tensor.grad[update_filter,:2], dim=-1, keepdim=True)
    RuntimeError: CUDA out of memory. Tried to allocate 7.94 GiB (GPU 0; 23.67 GiB total capacity; 13.48 GiB already allocated; 4.58 GiB free; 17.59 GiB reserved in total by PyTorch)
    Training progress:   0%|          | 0/30000 [00:00<?, ?it/s]

I used the official dataset.
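The crash happens while `add_densification_stats` accumulates screen-space gradient norms, so it is worth confirming how much VRAM PyTorch actually sees at that point. A minimal diagnostic sketch using standard `torch.cuda` introspection calls (not part of this repository, just generic PyTorch):

```python
import torch

# Free / total VRAM as reported by the CUDA driver (bytes).
free_b, total_b = torch.cuda.mem_get_info()
print(f"driver-free: {free_b / 2**30:.2f} GiB of {total_b / 2**30:.2f} GiB")

# Memory PyTorch itself has allocated and reserved on the current device.
print(f"allocated: {torch.cuda.memory_allocated() / 2**30:.2f} GiB")
print(f"reserved:  {torch.cuda.memory_reserved() / 2**30:.2f} GiB")

# Detailed allocator breakdown; useful for spotting fragmentation, which
# would match the "4.58 GiB free / 17.59 GiB reserved" pattern above.
print(torch.cuda.memory_summary())
```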

jaco001 commented 1 month ago

You have a lot of VRAM for this dataset. Just close web browsers and other software.
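A quick way to verify this: `nvidia-smi` (no flags needed) lists every process currently holding VRAM, so you can re-run it after closing applications and confirm the memory was actually released.

```shell
# Lists the GPUs plus each process and the VRAM it holds; run again
# after closing browsers/apps to confirm the memory came back.
nvidia-smi
```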

cjw1005 commented 1 month ago

> You have a lot of VRAM for this dataset. Just close web browsers and other software.

The truck dataset I used contains only 251 photos, and I hit the same error even after shutting down the other software. [image attached]
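If the error persists with everything else closed, the repository README also documents two ways to reduce VRAM pressure: training at a lower input resolution (`-r`) and keeping the source images in host memory (`--data_device cpu`). A hedged example using the paths from this thread:

```shell
# Halve the input resolution and keep source images in CPU RAM;
# both flags are documented in the gaussian-splatting README.
python train.py -s data/truck -m data/truck/output -r 2 --data_device cpu
```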

jaco001 commented 1 month ago

I don't have problems with the truck dataset, but I'm on a different CUDA version (11.8). If you think it's a dataset problem, just preprocess it (resize/resharpen, for example) and try again.
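If you go the preprocessing route, here is a minimal sketch with Pillow that writes 2x-downscaled copies of the input images. The paths are assumptions; adjust them to your layout, and re-run the COLMAP/convert step on the downscaled folder before training.

```python
from pathlib import Path

from PIL import Image  # pip install Pillow

src = Path("data/truck/images")        # assumed location of the input photos
dst = Path("data/truck/images_half")   # downscaled copies are written here
dst.mkdir(parents=True, exist_ok=True)

for img_path in sorted(src.glob("*.jpg")):
    with Image.open(img_path) as im:
        # Lanczos resampling keeps the downscaled images reasonably sharp.
        half = im.resize((im.width // 2, im.height // 2), Image.LANCZOS)
        half.save(dst / img_path.name, quality=95)
```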