Anttwo / SuGaR

[CVPR 2024] Official PyTorch implementation of SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering
https://anttwo.github.io/sugar/

To our great author(s): about refine-mesh speed up #52

yuedajiong opened this issue 11 months ago

yuedajiong commented 11 months ago

I found that the 'SHAPE' quality after the coarse training is already very good (good enough), so I guess we could do the 'TEXTURE' optimization by fitting the mesh with a differentiable rasterization renderer, which is very fast, like this tutorial: https://pytorch3d.org/tutorials/fit_textured_mesh. I have run many experiments in this direction and have some intuition about it, although I haven't actually tested this specific approach yet.
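For reference, here is a minimal sketch of the textured-mesh fitting idea, closely following the linked PyTorch3D tutorial rather than anything in the SuGaR code. The geometry, camera, and target images below are placeholders (an ico-sphere and dummy tensors); in practice they would be the coarse-stage mesh and the training views.

```python
import torch
from pytorch3d.utils import ico_sphere
from pytorch3d.renderer import (
    FoVPerspectiveCameras, RasterizationSettings, MeshRenderer,
    MeshRasterizer, SoftPhongShader, PointLights, TexturesVertex,
)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder geometry: replace with the mesh extracted after coarse training.
mesh = ico_sphere(4, device)
num_verts = mesh.verts_packed().shape[0]

# Learnable per-vertex colors (a full UV texture can be optimized the same way).
verts_rgb = torch.full((1, num_verts, 3), 0.0, device=device, requires_grad=True)

cameras = FoVPerspectiveCameras(device=device)               # replace with the training cameras
target_images = torch.zeros(1, 256, 256, 3, device=device)   # replace with the training images

renderer = MeshRenderer(
    rasterizer=MeshRasterizer(
        cameras=cameras,
        raster_settings=RasterizationSettings(image_size=256),
    ),
    shader=SoftPhongShader(device=device, cameras=cameras, lights=PointLights(device=device)),
)

optimizer = torch.optim.Adam([verts_rgb], lr=1e-2)
for _ in range(200):
    optimizer.zero_grad()
    # Keep colors in (0, 1) and attach them to the fixed mesh as vertex textures.
    mesh.textures = TexturesVertex(verts_features=torch.sigmoid(verts_rgb))
    rendered = renderer(mesh)                                 # (1, H, W, 4) RGBA
    loss = (rendered[..., :3] - target_images).abs().mean()   # L1 photometric loss
    loss.backward()
    optimizer.step()
```

Since only the texture is optimized here, each step is a single rasterization pass over a fixed mesh, which is why this kind of fitting is fast compared to also optimizing the geometry.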

train:

- 00000 ~ 07000: GS (loss = L1 + SSIM) (plain Gaussian Splatting, fast) for ..., then prune_points_by_opacity, reset_neighbors
- 07000 ~ 09000: entropy loss for binary opacity (slow, but faster than step #3; see the sketch below)
- 09000 ~ 15000: opacity < 0.5, then SuGaR regularization: flatten the Gaussians and align them to the surface (very slow)
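A minimal sketch of what such a binary-opacity entropy term could look like (not the exact loss in the SuGaR code; the tensor name `opacities` and the weight in the usage comment are placeholders):

```python
import torch

def binary_opacity_entropy(opacities: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Mean binary entropy of opacities in (0, 1); it is minimal when opacities are near 0 or 1,
    so adding it to the loss pushes Gaussians toward fully opaque or fully transparent."""
    o = opacities.clamp(eps, 1.0 - eps)
    return -(o * o.log() + (1.0 - o) * (1.0 - o).log()).mean()

# Hypothetical usage inside a training step:
# loss = rendering_loss + entropy_weight * binary_opacity_entropy(opacities)
```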

coarse: [image]

refine: [image]

Anttwo commented 11 months ago

Hello @yuedajiong,

Sure, there are certainly other good ways to refine a traditional UV texture for the mesh! With SuGaR, the goal of the refinement is not only to smooth the mesh and provide a texture, but also to build a hybrid Mesh + 3D Gaussians representation.

Concerning the refinement time, the default setting in our code is "long" (15k iterations), which takes up to a full hour: it produces the best metrics, but makes the optimization much longer. However, selecting the "short" refinement setting (2k iterations) already provides very good-looking hybrid representations and only lasts a few minutes! So currently, I think the most straightforward way to speed up the refinement is to reduce the number of iterations, as it still provides good performance and rendering quality.
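Purely as an illustration of that trade-off (the names below are hypothetical, not the repo's actual configuration), the iteration budgets and rough timings from this reply could be summarized as:

```python
# Hypothetical presets summarizing the reply above; not SuGaR's actual config object.
REFINEMENT_PRESETS = {
    "short": 2_000,   # a few minutes; already good-looking hybrid representations
    "long": 15_000,   # up to an hour; best metrics
}

def refinement_iterations(preset: str = "short") -> int:
    """Pick the refinement iteration budget by preset name (hypothetical helper)."""
    return REFINEMENT_PRESETS[preset]
```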

yuedajiong commented 11 months ago

got it, thanks.