clearliu777 opened 3 months ago
Hi, did you change any code? It seems that the initialization of the Gaussians is incorrect, because `features_dc` and `features_rest` do not have matching lengths. Could you please check the initialization of both? I guess one of them is an empty 0-size tensor.
Thank you for your quick response! I did not change any code; I just applied it to my own dataset. While debugging this error, I found that the shapes of `features_dc` and `features_rest` change from `[x, 0, 3]` to `[1, x, 1, 3]`, where `x` denotes the number of nodes. That is what raises the "Sizes of tensors" error.
In the initial phase, `torch.cat((features_dc, features_rest), dim=1)` concatenates `[node, 0, 3]` with `[node, 1, 3]` along `dim=1`. However, after `self.iterations_node_sampling = 7500`, it becomes `[1, node, 0, 3]` concatenated with `[1, node, 1, 3]` along `dim=1`.
I reproduced this error and printed `features_dc` and `features_rest`, as shown below.
```
Initialization with all pcl. Need to reset the optimizer. [01/04 21:58:38]
Initialize Learnable Gaussians for Nodes with Point Clouds! [01/04 21:58:38]
Control node initialized with 16 from 16 points. [01/04 21:58:38]
torch.Size([204, 1, 3]) ------------ torch.Size([204, 15, 3]) [01/04 21:58:38] (7499 iter)
================================ [01/04 21:58:38]
torch.Size([1, 16, 1, 3]) ------------ torch.Size([1, 16, 0, 3]) [01/04 21:58:38] (7500 iter)
```
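The mismatch in the log can be reproduced with a minimal `torch.cat` check, independent of the repository's code (a standalone sketch using the shapes printed above, not the actual model tensors):

```python
import torch

# Shapes from the log at iteration 7499: concatenation along dim=1 works,
# because all dimensions other than dim 1 (here dims 0 and 2) match.
features_dc = torch.zeros(204, 1, 3)
features_rest = torch.zeros(204, 15, 3)
ok = torch.cat((features_dc, features_rest), dim=1)
print(ok.shape)  # torch.Size([204, 16, 3])

# Shapes from the log at iteration 7500: dim 2 is 1 vs 0, so torch.cat
# raises "Sizes of tensors must match except in dimension 1".
features_dc = torch.zeros(1, 16, 1, 3)
features_rest = torch.zeros(1, 16, 0, 3)
try:
    torch.cat((features_dc, features_rest), dim=1)
except RuntimeError as e:
    print(e)
```

This shows the error is purely a shape-alignment problem: `torch.cat` only tolerates a size difference along the concatenation dimension itself.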
I suspect the problem is that the initialized Gaussians are not aligned with the true scene content; that is why `features_dc` ends up with a zero-sized dimension. On D-NeRF datasets, or any other self-captured datasets where the COLMAP point clouds are correct, this error would not be raised.
I suggest you try the solution here: https://github.com/yihua7/SC-GS/issues/12#issuecomment-1980336869. By keeping all points and converting them into Gaussians at the initialization step, the extinction of Gaussians in the first stage may be avoided.
If the above method does not solve your problem, you can try `--random_init_deform_gs` to initialize the Gaussians rather than using COLMAP point clouds. In this way, the initial Gaussians will be uniformly sampled from the cube spanning -1 to 1. You can change the code here: https://github.com/yihua7/SC-GS/blob/26cd57d09598b2f5d951029808a5ac9f0ff4f626/train_gui.py#L160 to enlarge or shrink the initial cube.
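For reference, the uniform sampling from the [-1, 1] cube mentioned above can be sketched as follows (a hypothetical illustration of the idea only; the function and variable names are not from the repository):

```python
import torch

def random_cube_init(num_points: int, half_size: float = 1.0) -> torch.Tensor:
    """Uniformly sample initial Gaussian centers from the cube [-half_size, half_size]^3."""
    # torch.rand draws from [0, 1); rescale and shift to the target cube.
    return torch.rand(num_points, 3) * 2 * half_size - half_size

xyz = random_cube_init(1000)
print(xyz.min().item() >= -1.0, xyz.max().item() < 1.0)  # True True
```

Increasing `half_size` here corresponds to broadening the initial cube at the linked line of `train_gui.py`.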
However, I strongly doubt that dynamic Gaussians can be trained on your data, since an inaccurate COLMAP point cloud implies inaccurate camera poses. Anyway, you can try the above solutions, and I hope this information helps! :)
Thanks for your reply! I will try it again and report back on this problem later.
```
Traceback (most recent call last):
  File "/home/cc/3dgs/SC-GS/train_gui.py", line 1886, in <module>
    gui.train(args.iterations)
  File "/home/cc/3dgs/SC-GS/train_gui.py", line 1000, in train
    self.train_node_rendering_step()
  File "/home/cc/3dgs/SC-GS/train_gui.py", line 1276, in train_node_rendering_step
    render_pkg_re = render(viewpoint_cam, self.deform.deform.as_gaussians, self.pipe, self.background, d_xyz, d_rot, d_scale, random_bg_color=random_bg_color, d_opacity=d_opacity, d_color=d_color, d_rot_as_res=self.deform.d_rot_as_res)
  File "/home/cc/3dgs/SC-GS/gaussian_renderer/__init__.py", line 122, in render
    sh_features = torch.cat([pc.get_features[:, :1] + d_color[:, None], pc.get_features[:, 1:]], dim=1) if d_color is not None and type(d_color) is not float else pc.get_features
  File "/home/cc/3dgs/SC-GS/scene/gaussian_model.py", line 122, in get_features
    return torch.cat((features_dc, features_rest), dim=1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 1 but got size 0 for tensor number 1 in the list.
```