I want to monitor how PSNR changes during training, so I increased the number of test_iterations to measure PSNR every 1k iterations and added scene.gaussians.precompute() in training_report, but it raises a CUDA out-of-memory error:

Traceback (most recent call last):
File "train.py", line 235, in <module>
training(lp.extract(args), op.extract(args), pp.extract(args), args.test_iterations, args.save_iterations, args.checkpoint_iterations, args.start_checkpoint, args.debug_from, args.comp, args.store_npz)
File "train.py", line 113, in training
training_report(tb_writer, iteration, Ll1, loss, l1_loss, iter_start.elapsed_time(iter_end), testing_iterations, scene, render, (pipe, background))
File "train.py", line 185, in training_report
image = torch.clamp(renderFunc(viewpoint, scene.gaussians, renderArgs)["render"], 0.0, 1.0)
File "/home/zbw/Compact-3DGS/gaussian_renderer/__init__.py", line 94, in render
cov3D_precomp = None)
File "/home/zbw/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
return forward_call(*input, **kwargs)
File "/home/zbw/anaconda3/envs/c3dgs/lib/python3.7/site-packages/diff_gaussian_rasterization/__init__.py", line 219, in forward
raster_settings,
File "/home/zbw/anaconda3/envs/c3dgs/lib/python3.7/site-packages/diff_gaussian_rasterization/__init__.py", line 41, in rasterize_gaussians
raster_settings,
File "/home/zbw/anaconda3/envs/c3dgs/lib/python3.7/site-packages/diff_gaussian_rasterization/__init__.py", line 92, in forward
num_rendered, color, radii, geomBuffer, binningBuffer, imgBuffer = _C.rasterize_gaussians(*args)
RuntimeError: CUDA out of memory. Tried to allocate 34.41 GiB (GPU 0; 31.75 GiB total capacity; 2.10 GiB already allocated; 27.04 GiB free; 2.38 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
Have you run into this situation before? How did you solve it? Thank you.
Precomputing is a trick for faster FPS at test time, so it is not needed during training.
To monitor PSNR, you can run the forward pass without precomputing, exactly as in the training iterations.
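In other words, in training_report you would call the same renderFunc used during training (without any precompute() call) and compute PSNR from the clamped render. As a minimal, hedged sketch, here is the standard PSNR formula for images scaled to [0, 1]; the psnr helper below is illustrative and not part of the repo:

```python
import math

def psnr(mse, max_val=1.0):
    """Peak signal-to-noise ratio in dB for pixel values in [0, max_val]."""
    return 10.0 * math.log10(max_val ** 2 / mse)

# Example: an MSE of 0.01 between rendered and ground-truth images
# corresponds to 20 dB.
print(psnr(0.01))
```

In the training loop you would obtain mse as the mean squared error between `torch.clamp(renderFunc(...)["render"], 0.0, 1.0)` and the ground-truth image, then log the resulting dB value to TensorBoard every 1k iterations.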