mabaorui / NeuralPull

Implementation of ICML 2021: Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces
MIT License

Higher-order gradient #6

Closed ThibaultGROUEIX closed 2 years ago

ThibaultGROUEIX commented 2 years ago

Hi @mabaorui,

Congrats again and sorry for the repeated queries. I am still trying to figure out a few aspects of the method and am thus opening various discussions, I hope that is ok.

I wonder whether you use higher-order gradients when backpropagating through Equation 1 of the main paper. I noticed that the PyTorch version does not use higher-order gradients, and since I have hardly ever used TensorFlow, the official code is hard for me to follow.
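To make the question concrete, here is a minimal PyTorch sketch (my own, not the repository's code) of the pulling step q' = q - f(q) * ∇f(q)/‖∇f(q)‖, where the `higher_order` flag controls whether the training loss backpropagates through ∇f itself (`create_graph=True`) or treats the gradient as a constant. The network architecture and target points are illustrative assumptions.

```python
import torch

def pull_query(net, q, higher_order):
    """One pulling step: q' = q - f(q) * grad f(q) / ||grad f(q)||."""
    q = q.detach().requires_grad_(True)
    f = net(q)                                            # predicted signed distance, shape (B, 1)
    # create_graph=True keeps the graph of grad f, enabling higher-order gradients
    g = torch.autograd.grad(f.sum(), q, create_graph=higher_order)[0]
    n = g / (g.norm(dim=-1, keepdim=True) + 1e-8)         # unit gradient direction
    return q - f * n                                      # pulled point, shape (B, 3)

# hypothetical setup: a tiny SDF network and random query / target pairs
net = torch.nn.Sequential(
    torch.nn.Linear(3, 16), torch.nn.Softplus(), torch.nn.Linear(16, 1))
q = torch.randn(4, 3)       # query points
p = torch.randn(4, 3)       # nearest surface points (stand-ins for the real targets)

loss = ((pull_query(net, q, higher_order=True) - p) ** 2).sum()
loss.backward()  # with higher_order=False the loss would still backprop through f,
                 # but not through the gradient direction n
```

With `higher_order=False`, `torch.autograd.grad` returns a tensor detached from the graph, so the direction `n` is treated as a constant during the backward pass; only the scalar `f(q)` carries gradients to the network.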

Besides the answer itself, I wonder what the pros and cons of using higher-order gradients in this equation are, and how you made this design decision.

Thanks a lot, Thibault

mabaorui commented 2 years ago

Hi @ThibaultGROUEIX , thank you for your question, and I am glad to discuss it with you. I do not use higher-order gradients. The gradient at a query point q is the direction of the fastest signed-distance increase in 3D space, so it is reasonable (or explainable) to move a query point along or against the gradient direction. I did not try higher-order gradients because I had no idea how they would relate to the movement of the query point. But your thoughts are very interesting. If you find a reasonable way to apply higher-order gradients to help reconstruction, I would be very happy to hear your ideas.
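The geometric intuition above can be checked with a small self-contained sketch (mine, not from the repository) using an exact sphere SDF: because ∇f is the direction of fastest signed-distance increase, a single step q - f(q) * ∇f(q)/‖∇f(q)‖ lands exactly on the surface when f and its gradient are exact.

```python
import math

def sdf_sphere(q, r=1.0):
    # exact signed distance to a sphere of radius r centered at the origin
    return math.sqrt(sum(x * x for x in q)) - r

def grad_sdf_sphere(q):
    # exact gradient: the unit vector pointing away from the center
    n = math.sqrt(sum(x * x for x in q))
    return tuple(x / n for x in q)

def pull(q, f, grad_f):
    # pulling step: q' = q - f(q) * grad f(q) / ||grad f(q)||
    d = f(q)
    g = grad_f(q)
    norm = math.sqrt(sum(x * x for x in g))
    return tuple(x - d * gx / norm for x, gx in zip(q, g))

q = (0.0, 0.0, 2.0)
q_pulled = pull(q, sdf_sphere, grad_sdf_sphere)  # -> (0.0, 0.0, 1.0), on the unit sphere
```

With a learned (inexact) f, the same step only approximately reaches the surface, which is what the training loss in the paper penalizes.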

ThibaultGROUEIX commented 2 years ago

Thanks @mabaorui. Yes, I think using higher-order gradients would be an improvement: they help when the gradient does not point toward the corresponding surface point. Best regards, Thibault