NVIDIAGameWorks / kaolin-wisp

NVIDIA Kaolin Wisp is a PyTorch library powered by NVIDIA Kaolin Core to work with neural fields (including NeRFs, NGLOD, instant-ngp and VQAD).

Does wisp._C.ops.hashgrid_interpolate_cuda provide gradients w.r.t. coordinates? #41

Closed alvaro-budria closed 1 year ago

alvaro-budria commented 2 years ago

I am trying to enhance NeuS with a hash grid encoding. NeuS uses the normal of the SDF as input to its color field, and this normal is computed as the gradient of the SDF w.r.t. the coordinates. That means that if I implement the SDF field with a hash grid, I need to be able to differentiate the hash grid not only w.r.t. the features but also w.r.t. the coordinates. This does not currently appear to be supported in either kaolin-wisp or kaolin, as I get None when I try to obtain the gradient w.r.t. the input coordinates.
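For reference, a minimal way to test whether an encoding exposes coordinate gradients is to mark the query points with `requires_grad` and ask autograd for the gradient; if the kernel lacks a backward pass w.r.t. coordinates, the result is None. The sketch below uses `F.grid_sample` (a plain bilinear feature lookup) as a stand-in for wisp's hashgrid, since the pattern is the same:

```python
import torch
import torch.nn.functional as F

# Stand-in for a grid encoding: bilinear feature lookup via grid_sample.
# (The real target is wisp's hashgrid kernel; this only shows the check.)
features = torch.randn(1, 8, 16, 16)       # (N, C, H, W) feature grid

coords = torch.rand(1, 1, 32, 2) * 2 - 1   # (N, 1, M, 2) in [-1, 1]
coords.requires_grad_(True)

out = F.grid_sample(features, coords, mode="bilinear", align_corners=True)
loss = out.sum()

# allow_unused=True mirrors the failure mode: a kernel with no
# coordinate backward would yield None here instead of a tensor.
grad_coords, = torch.autograd.grad(loss, coords, allow_unused=True)
print(grad_coords is None)  # False -> coordinate gradients are available
```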

Is it planned to add this capability to kaolin-wisp?

ghy0324 commented 2 years ago

@Caenorst @tovacinni @orperel I also have this requirement, and I think it is important especially for research based on SDF. Do you have a plan to implement it? Thanks a lot!

alvaro-budria commented 2 years ago

It seems that NVlabs' tiny-cuda-nn framework supports computing the gradient w.r.t. the input coordinates: see the kernel_grid_backward_input method in https://github.com/NVlabs/tiny-cuda-nn/blob/master/include/tiny-cuda-nn/encodings/grid.h.

I think it could be easily adapted to kaolin-wisp.

orperel commented 2 years ago

Hi @ghy0324, @alvaro-budria. We just fixed the missing Jacobian for trilinear interpolation in kaolin (it should be pushed very soon). That fix affects the Octree, Codebook and Triplane grids in wisp, but not the hashgrid, which we'll address separately.
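As an aside, the trilinear case can be sanity-checked against PyTorch's own trilinear sampler, which already propagates gradients to the query coordinates. This is only an illustration of the expected behavior, not kaolin's implementation:

```python
import torch
import torch.nn.functional as F

# 3D feature volume, (N, C, D, H, W)
volume = torch.randn(1, 4, 8, 8, 8)

# Query points in normalized [-1, 1] coordinates, (N, 1, 1, M, 3)
pts = (torch.rand(1, 1, 1, 64, 3) * 2 - 1).requires_grad_(True)

# mode="bilinear" on a 5D input performs trilinear interpolation;
# gradients flow to both `volume` and `pts`.
feats = F.grid_sample(volume, pts, mode="bilinear", align_corners=True)

grad_pts, = torch.autograd.grad(feats.sum(), pts)
print(grad_pts.shape)  # same shape as pts: torch.Size([1, 1, 1, 64, 3])
```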

Regarding tinycudann: we do plan to support it officially. Until then, your suggestion is valid: you can install the torch bindings and use them in your BaseNeuralField implementation.

orperel commented 1 year ago

Following PR #94, Wisp's HashGrid should now support gradients with respect to the coordinates.

Make sure you run `python setup.py develop` after pulling, to rebuild the CUDA kernels.

Hippogriff commented 1 year ago

@orperel It seems that the current version still doesn't return gradients w.r.t. the input coordinates. Is it fixed?

Hippogriff commented 1 year ago

> It seems that NVlabs' tiny-cuda-nn framework supports computing the gradient w.r.t. the input coordinates: see the kernel_grid_backward_input method in https://github.com/NVlabs/tiny-cuda-nn/blob/master/include/tiny-cuda-nn/encodings/grid.h.
>
> I think it could be easily adapted to kaolin-wisp.

@alvaro-budria Hey, were you able to get gradients using tinycudann? I tried it, but the gradient returned by their implementation seems to differ from numerical gradients.

alvaro-budria commented 1 year ago

@Hippogriff I have not tried comparing the gradients from tinycudann against numerical gradients, as everything seemed to work fine in my experiments. If you found a problem there, you should probably reopen this issue, or better yet, create a new one with a code example.
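For anyone wanting to run such a comparison, `torch.autograd.gradcheck` compares the analytic gradient against central finite differences. Note that double precision matters: in float32 the numerical estimate itself is noisy, which can make a correct gradient look wrong. The sketch below checks a CPU trilinear lookup as a stand-in for the encoding under test (not tinycudann itself):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Double precision is essential: gradcheck's finite differences are
# too inaccurate in float32 to be a reliable reference.
vol = torch.randn(1, 2, 8, 8, 8, dtype=torch.double)
pts = torch.rand(1, 1, 1, 16, 3, dtype=torch.double) * 1.8 - 0.9
pts.requires_grad_(True)

def interp(p):
    # Trilinear lookup; stands in for the encoding being checked.
    return F.grid_sample(vol, p, mode="bilinear", align_corners=True)

# Compares autograd's analytic Jacobian against central differences;
# raises (or returns False) on a mismatch.
ok = torch.autograd.gradcheck(interp, (pts,), eps=1e-6, atol=1e-4)
print(ok)  # True
```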