bennyguo / instant-nsr-pl

Neural Surface reconstruction based on Instant-NGP. Efficient and customizable boilerplate for your research projects. Train NeuS in 10min!
MIT License

About the loss functions used in system/neus.py. #74

Closed ZhouWeikun closed 1 year ago

ZhouWeikun commented 1 year ago

First, thanks for your work. I have some questions about the loss functions you use in NeuS:

1. `loss_sparsity = torch.exp(-self.config.system.loss.sparsity_scale * out['sdf_samples'].abs()).mean()`
2. `loss_opaque = binary_cross_entropy(opacity, opacity)`

Could you give more information about them, or point to papers related to them?

bennyguo commented 1 year ago

Hi! The sparsity loss is proposed by SparseNeuS and is used to remove unnecessary occupancy in the few-shot setting:

[screenshot: sparseness regularization equation from the SparseNeuS paper]

I tried training with this loss but found no obvious difference in the dense-capture setting (e.g., NeRF-Synthetic).
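For reference, here is a minimal standalone sketch of that sparsity term, assuming a flat tensor of SDF samples and a scalar `sparsity_scale` (the helper name and the default scale are illustrative, not the repo's exact code):

```python
import torch

def sparsity_loss(sdf_samples: torch.Tensor, sparsity_scale: float = 1.0) -> torch.Tensor:
    """SparseNeuS-style sparseness regularization (sketch).

    exp(-scale * |sdf|) is close to 1 for points near the zero level set and
    decays toward 0 far from any surface, so minimizing the mean discourages
    spurious surfaces in regions the photometric loss does not constrain.
    """
    return torch.exp(-sparsity_scale * sdf_samples.abs()).mean()


# Usage: `sdf_samples` stands in for out['sdf_samples'] from the NeuS forward pass.
sdf_samples = torch.randn(4096)
loss_sparsity = sparsity_loss(sdf_samples, sparsity_scale=1.0)
```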

The opaque loss encourages the per-ray opacity to be either 0 or 1. It is used in my VMesh paper:

[screenshot: opacity regularization equation from the VMesh paper]

I found this loss to improve surface quality for NeuS in some scenarios.
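To see why `binary_cross_entropy(opacity, opacity)` has that effect: BCE of a value against itself is the entropy of a Bernoulli distribution with that probability, which peaks at 0.5 and vanishes at 0 and 1. Below is a minimal sketch that writes the entropy out explicitly instead of calling the repo's own `binary_cross_entropy` helper; the clamping epsilon is an assumption:

```python
import torch

def opaque_loss(opacity: torch.Tensor, eps: float = 1.0e-6) -> torch.Tensor:
    """Binary-entropy regularizer on per-ray opacity (sketch).

    -(o*log(o) + (1-o)*log(1-o)) is what binary_cross_entropy(o, o) computes:
    it is largest at o = 0.5 and zero at o = 0 or o = 1, so minimizing it
    pushes each ray to be either fully transparent or fully opaque.
    """
    o = opacity.clamp(eps, 1.0 - eps)  # keep log() finite at exactly 0 or 1
    return -(o * torch.log(o) + (1.0 - o) * torch.log(1.0 - o)).mean()


# Usage: `opacity` stands in for the accumulated per-ray opacity from volume rendering.
opacity = torch.rand(4096)
loss_opaque = opaque_loss(opacity)
```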

Hope this helps!

ZhouWeikun commented 1 year ago

Thanks for your reply!