Closed: JasonTian1091 closed this issue 1 month ago
I ran: CUDA_VISIBLE_DEVICES=0 python train_gui.py --source_path data/plate_novel_view --model_path outputs/plate_test --deform_type node --node_num 512 --hyper_dim 8 --eval --gt_alpha_mask_as_scene_mask --local_frame --resolution 2 --W 800 --H 800 --random_init_deform_gs
I have encountered the same error as you. How did you solve it?
Hi, I encountered a problem when running train_gui.py on the NeRF-DS dataset:
Optimizing outputs/plate_test_randomeinit_node
Output folder: outputs/plate_test_randomeinit_node [12/09 15:56:34]
Building Learnable Gaussians for Nodes! [12/09 15:56:34]
Loading trained model at iteration None [12/09 15:56:34]
Found dataset.json file, assuming Nerfies data set! [12/09 15:56:34]
Reading Nerfies Info [12/09 15:56:34]
Loading Training Cameras [12/09 15:56:35]
Loading Test Cameras [12/09 15:56:36]
Number of points at initialisation : 9714 [12/09 15:56:39]
Generating random point cloud (100000)... [12/09 15:56:39]
Initialize Learnable Gaussians for Nodes with Point Clouds! [12/09 15:56:39]
Control node initialized with 512 from 100000 points. [12/09 15:56:39]
Training progress:   0%| | 0/80000 [00:00<?, ?it/s]
Initialize Learnable Gaussians for Nodes with Point Clouds! [12/09 16:00:14]
Control node initialized with 512 from 83997 points. [12/09 16:00:14]
Best PSNR=0.00000 in Iteration 0, SSIM=0.00000, LPIPS=inf, MS-SSIM=0.00000, ALex-LPIPS=inf
[ITER 2999] Saving Gaussians [12/09 16:01:24]
19%|███████████████▎ | 14998/80000 [07:55<34:20, 31.55it/s]
Traceback (most recent call last):
File "train_gui.py", line 1886, in <module>
gui.train(args.iterations)
File "train_gui.py", line 1002, in train
self.train_step()
File "train_gui.py", line 1172, in train_step
cur_psnr, cur_ssim, cur_lpips, cur_ms_ssim, cur_alex_lpips = training_report(self.tb_writer, self.iteration, Ll1, loss, l1_loss, self.iter_start.elapsed_time(self.iter_end), self.testing_iterations, self.scene, render, (self.pipe, self.background), self.deform, self.dataset.load2gpu_on_the_fly, self.progress_bar)
File "/home/cdzk/th/train.py", line 100, in training_report
ms_ssim_list.append(ms_ssim(image[None], gt_image[None], data_range=1.).mean())
File "/home/cdzk/anaconda3/envs/th/lib/python3.8/site-packages/pytorch_msssim/ssim.py", line 213, in ms_ssim
assert smaller_side > (win_size - 1) * (
AssertionError: Image size should be larger than 160 due to the 4 downsamplings in ms-ssim
Best PSNR=0.00000 in Iteration 0, SSIM=0.00000, LPIPS=inf, MS-SSIM=0.00000, ALex-LPIPS=inf:   6%| | 5000/80000
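For context, the assertion is raised inside pytorch_msssim: ms_ssim downsamples the input four times, so the smaller spatial side must exceed (win_size - 1) * 2**4 = 160 px with the default win_size of 11. With --resolution 2 the rendered frames apparently fall below that threshold. Below is a minimal sketch of a possible workaround, not the repository's own fix; safe_ms_ssim is a hypothetical helper, and image / gt_image are assumed to be the C×H×W tensors in [0, 1] used in the ms_ssim call shown in the traceback.

```python
import torch
from pytorch_msssim import ms_ssim


def safe_ms_ssim(image: torch.Tensor, gt_image: torch.Tensor, win_size: int = 11) -> torch.Tensor:
    """Compute MS-SSIM, shrinking the window (or skipping) when the render is small.

    pytorch_msssim downsamples 4 times and asserts that the smaller spatial side
    exceeds (win_size - 1) * 2**4, i.e. > 160 px for the default win_size of 11.
    """
    smaller_side = min(image.shape[-2:])
    if smaller_side <= (win_size - 1) * 2 ** 4:
        # Largest window that still satisfies the assertion, forced to be odd
        # because pytorch_msssim requires an odd win_size.
        win_size = (smaller_side - 1) // 2 ** 4 + 1
        if win_size % 2 == 0:
            win_size -= 1
        if win_size < 3:
            # Too small even for a 3x3 window: skip the metric.
            return torch.tensor(float('nan'))
    return ms_ssim(image[None], gt_image[None], data_range=1., win_size=win_size).mean()
```

With such a helper, the call at train.py line 100 could become ms_ssim_list.append(safe_ms_ssim(image, gt_image)); alternatively, simply rendering at a resolution whose shorter side exceeds 160 px avoids the assertion without code changes.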