Open JerryPW opened 9 months ago
Hi, I know this bug: it is caused by having too few viewpoints, i.e. insufficient viewpoint coverage.
Point-based rendering requires a rich array of viewpoints to help it converge. Using a single view under Gaussian rasterization will definitely lead to this issue. You could remove the deformation and train your dataset using only vanilla 3D Gaussians, and I believe the same bug will occur.
I think that if you use a pipeline based on the official differentiable Gaussian rasterization, this bug will always appear.
Thanks for your reply! It really helps! BTW, if I really want to use 3DGS on a single-view dataset, what can I do? Do you think adding a depth loss would be useful?
I think 3D-GS relies more heavily on viewpoint richness than NeRF does. For example, 20 viewpoints might not be sparse for NeRF, but they are sparse for 3D-GS. Point-based rendering needs 360-degree viewpoint coverage to converge its geometry; otherwise it is very prone to OOM.
A depth loss is certainly useful for few-shot learning, and there are already many papers on arXiv. But I don't think the quality reaches a usable level yet. I believe ReconFusion offers new ideas for few-shot and even single-view learning.
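For readers wanting to experiment, here is a minimal sketch of the kind of depth term the papers mentioned above typically add to the photometric loss. This is an illustration only: the function name, the zero-means-invalid convention, and the `lambda_depth` weight are assumptions, not part of the official 3D-GS codebase (which uses PyTorch tensors rather than NumPy arrays).

```python
import numpy as np

def depth_l1_loss(rendered_depth: np.ndarray,
                  gt_depth: np.ndarray,
                  lambda_depth: float = 0.1) -> float:
    """Masked L1 depth term (illustrative sketch).

    Pixels where gt_depth == 0 are treated as invalid, since monocular or
    sensor depth supervision is often sparse.
    """
    valid = gt_depth > 0
    if not valid.any():
        return 0.0  # no supervision available for this view
    return lambda_depth * float(np.abs(rendered_depth[valid] - gt_depth[valid]).mean())
```

In a training loop, this term would simply be added to the usual L1 + D-SSIM photometric loss with a small weight so it regularizes geometry without dominating appearance.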
OK, I understand. Thanks for taking the time to explain!
Hi there. I have encountered the same issue with a single-view image as well. It seems to be caused by the alpha reset that starts at iteration 3000, with something going wrong in the densification that runs every 100 iterations.
Perhaps we could adjust the arguments to fit single-view scene optimization, but I haven't located the specific problem yet, so I don't know which parameters directly affect this issue either.
Please let me know if you make any new discoveries, thank you.
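The timing described above can be made concrete with a small sketch of the training schedule (the helper names and the default intervals here are assumptions that mirror the official 3D-GS defaults, not code from the repository). Iteration 3100 is the first densify/prune step after the opacity reset at 3000, which matches the reported failure point: with only one viewpoint, pruning right after a reset can remove most Gaussians.

```python
# Illustrative schedule checks for a 3D-GS-style training loop.
# Defaults assumed: opacity reset every 3000 iters, densify/prune every
# 100 iters within a [500, 15000) window.

def should_reset_opacity(it: int, reset_interval: int = 3000) -> bool:
    """True on iterations where all opacities are clamped to a low value."""
    return it > 0 and it % reset_interval == 0

def should_densify_and_prune(it: int, interval: int = 100,
                             densify_from: int = 500,
                             densify_until: int = 15000) -> bool:
    """True on iterations where low-opacity Gaussians may be pruned."""
    return densify_from <= it < densify_until and it % interval == 0
```

Under this schedule, the reset fires at 3000 and the very next prune fires at 3100, before single-view gradients can raise the opacities back up.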
Hi, the issue you mentioned also occurs in vanilla 3D-GS. I believe it might originate from the implementation of 3D-GS itself, as I've encountered similar errors when testing scenes with fewer training viewpoints.
If the problem is indeed caused by the opacity reset as you suggested, you could try disabling the opacity reset to help pinpoint the issue. For example, you could change the opacity reset interval here to 100000. I believe the opacity reset is just an implementation trick, used primarily to remove floaters when rendering real-world scenes; removing it should not affect Blender scenes.
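Concretely, the idea is to set the reset interval past the total iteration count so the reset never fires. The sketch below uses attribute names matching the official `arguments.py` defaults as I understand them; treat the exact values as assumptions and check them against your copy of the code.

```python
# Sketch of the relevant optimization parameters in a 3D-GS-style config.
# Setting opacity_reset_interval above the total iteration count
# effectively disables the opacity reset.
class OptimizationParams:
    def __init__(self):
        self.iterations = 30_000
        self.densification_interval = 100
        self.opacity_reset_interval = 100_000  # default is 3_000; > iterations => never fires
```

The same effect can usually be achieved from the command line by passing the corresponding argument instead of editing the source.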
Thank you for your kind reply; this is indeed due to the implementation of 3D-GS itself. The specific location where the error occurs is intriguing.
Hi! Thanks for your great work on 3DGS! However, when I tried your code on my custom dataset (single-view), it hit an error and I can't figure out why.
It seems the gradient disappears after 3100 iterations, probably caused by `densification_interval` after `opacity_reset`. I've also tried other datasets like DyNeRF and they work fine. So is it because of the single view, which the model can't handle well, or is there another explanation for this problem? Looking forward to your reply soon!