Hi, I'm modifying the torch-ngp code to create a model that updates the pose. I created a new custom autograd function with forward and backward methods. However, I don't understand why the backward I defined is never executed, even though the forward is called.
In a similar context, I wonder why the backward of the _sh_encoder class is not executed, even though its forward was. This is my first time implementing a custom autograd function, so it would be very helpful if you could explain this... :)
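For context, here is a minimal sketch of the kind of custom autograd function I mean (the class and tensor names are placeholders, not the actual torch-ngp code). As far as I understand, the backward should only be triggered when the function is called via `.apply()`, at least one input requires grad, and `loss.backward()` is later called on something computed from its output:

```python
import torch

class _square(torch.autograd.Function):
    # Placeholder example, not the real _sh_encoder from torch-ngp.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x

x = torch.randn(4, requires_grad=True)  # backward only runs if some input requires grad
y = _square.apply(x)                    # must go through .apply(), not forward() directly
loss = y.sum()
loss.backward()                         # backward of _square should be invoked here
print(x.grad)
```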
Thanks a lot :)