Anttwo / SuGaR

[CVPR 2024] Official PyTorch implementation of SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering
https://anttwo.github.io/sugar/

bad results for a simple indoor scene #85

Open hhcxx2006 opened 10 months ago

hhcxx2006 commented 10 months ago

Hi team, I am very excited about your work, excellent job! I want to use your algorithm to reconstruct an indoor scene, but the results are not very good. Could you please tell me whether this output is expected? The floor and the walls in particular look poor.

[image: screenshot of the reconstructed indoor scene]

Here is my original data, collected with a RealSense D435i: https://drive.google.com/file/d/1jmiOEqhWP99EmIEUKmmq6cgwOeP2yzZb/view?usp=drive_link

The commands I use are listed below (a small driver sketch chaining them follows the list):

  1. python gaussian_splatting/convert.py -s <location>
  2. python gaussian_splatting/train.py -s <path to COLMAP dataset> --iterations 7000 -m <path to the desired output directory>
  3. python train.py -s <path to COLMAP dataset> -c <path to the Gaussian Splatting checkpoint> -r "sdf"
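For reference, here are the same three steps chained in one small Python driver. This is only a convenience sketch; the `scene` and `gs_out` paths are hypothetical placeholders for my actual directories.

```python
# Convenience sketch: run the SuGaR pipeline end to end.
# Paths below are hypothetical placeholders.
import subprocess

scene = "data/my_room"        # COLMAP dataset root (placeholder)
gs_out = "output/my_room_gs"  # vanilla 3DGS checkpoint dir (placeholder)

# 1. Convert the raw images into a COLMAP dataset.
subprocess.run(["python", "gaussian_splatting/convert.py", "-s", scene], check=True)
# 2. Train vanilla Gaussian Splatting for 7000 iterations.
subprocess.run(["python", "gaussian_splatting/train.py", "-s", scene,
                "--iterations", "7000", "-m", gs_out], check=True)
# 3. Run SuGaR with the SDF regularization.
subprocess.run(["python", "train.py", "-s", scene, "-c", gs_out, "-r", "sdf"], check=True)
```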

Thank you!

kitmallet commented 10 months ago

@hhcxx2006 It is very likely that your photo set shows a very plain room. Any program of this type (Gaussian splatting, NeRFs, or photogrammetry) needs texture everywhere to determine where surfaces are: if a wall is all the same color, the system cannot easily tell one point from another. Maybe try a dataset with decorations all over the walls? A minimal sketch of the effect follows.
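To see this concretely, here is a minimal numpy sketch (unrelated to SuGaR itself): patches whose intensity barely varies look identical from every viewpoint, so stereo/SfM matching cannot localize points on them. The function name, patch size, and threshold are illustrative only.

```python
# Minimal sketch: flag low-texture image patches where matching is ambiguous.
# `low_texture_mask`, the patch size, and the threshold are made up for illustration.
import numpy as np

def low_texture_mask(gray: np.ndarray, patch: int = 16, thresh: float = 2.0) -> np.ndarray:
    """Mark patches whose intensity std-dev falls below `thresh`."""
    h, w = gray.shape
    gh, gw = h // patch, w // patch
    tiles = gray[: gh * patch, : gw * patch].reshape(gh, patch, gw, patch)
    return tiles.std(axis=(1, 3)) < thresh

# Synthetic image: flat (texture-less) on the left, noisy on the right.
img = np.zeros((128, 128), dtype=np.float32)
img[:, 64:] = np.random.rand(128, 64).astype(np.float32) * 255
print(low_texture_mask(img))  # left-half patches are flagged True
```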

hhcxx2006 commented 10 months ago

@kitmallet Okay, thank you for your reply. I will put some wallpaper on the walls and try again.

Anttwo commented 10 months ago

Hello @hhcxx2006,

Indeed, @kitmallet is entirely right! If you're interested in knowing why texture-less areas are so complicated to reconstruct accurately, you can check my explanations in this previous issue, which had a similar problem with a texture-less sky.

In practice, SuGaR is still able to reconstruct a room with a good-looking hybrid representation, as our results in the playroom scene (from the DeepBlending dataset) show, for instance. However, if you have a lot of texture-less, monochrome walls, their surfaces will probably not be flat but a little chaotic. Feel free to visualize the scene with the SuGaR viewer, as it lets you explore the hybrid representation.
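If you want an offline look instead of the viewer, here is a minimal sketch assuming open3d is installed (open3d is not part of the SuGaR toolchain, and the mesh path is a hypothetical placeholder):

```python
# Hedged sketch: inspect an extracted mesh with open3d (not a SuGaR tool).
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("output/coarse_mesh/my_room/sugarmesh.ply")  # placeholder path
mesh.compute_vertex_normals()  # needed for shaded rendering
o3d.visualization.draw_geometries([mesh])  # orbit around the walls to check flatness
```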

May I ask whether your image shows the coarse mesh (.ply file with vertex colors) or the refined mesh (.obj file with a PNG texture)? The refined mesh generally looks better, as vertex colors are a very poor way to color a mesh.
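If you are unsure which file you rendered, a quick way to tell them apart (again an open3d sketch with a placeholder path, not an official SuGaR utility):

```python
import open3d as o3d

# Load with post-processing so materials/textures of an .obj are parsed too.
mesh = o3d.io.read_triangle_mesh("path/to/mesh.obj", enable_post_processing=True)
print("vertex colors:", mesh.has_vertex_colors())  # True for the coarse .ply
print("triangle uvs:", mesh.has_triangle_uvs())    # True for the refined .obj
print("textures:", len(mesh.textures) > 0)         # True when a PNG texture was loaded
```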