Below is an example comparing the gradients returned by tinycudann against numerical gradients for a simple field built on a hash grid encoding; the two disagree by a large margin.
I also tried a pure PyTorch implementation of the hash grid. With that implementation, the gradients computed by autograd match the numerical gradients.
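A minimal sketch of the kind of comparison I mean is below: it encodes a batch of points with a tinycudann hash grid, defines a scalar field as the sum of the encoded features, and compares the input gradient from autograd against central finite differences. The hash-grid config values and the epsilon are illustrative, not the exact ones from my setup.

```python
# Sketch only: assumes tinycudann is installed with its PyTorch bindings and a CUDA device is available.
import torch
import tinycudann as tcnn

torch.manual_seed(0)
device = torch.device("cuda")

# Hash grid encoding; config values here are illustrative.
encoding = tcnn.Encoding(
    n_input_dims=3,
    encoding_config={
        "otype": "HashGrid",
        "n_levels": 8,
        "n_features_per_level": 2,
        "log2_hashmap_size": 15,
        "base_resolution": 16,
        "per_level_scale": 1.5,
    },
)

def field(x):
    # Simple scalar field: sum of encoded features per point (cast to fp32).
    return encoding(x).float().sum(dim=-1)

# Random points in [0, 1]^3, with gradients enabled on the inputs.
x = torch.rand(4, 3, device=device, requires_grad=True)

# Gradient w.r.t. the input as returned through tinycudann's backward pass.
y = field(x)
grad_autograd, = torch.autograd.grad(y.sum(), x)

# Central finite differences as the numerical reference.
eps = 1e-3
grad_numeric = torch.zeros_like(x)
with torch.no_grad():
    for d in range(x.shape[1]):
        offset = torch.zeros_like(x)
        offset[:, d] = eps
        grad_numeric[:, d] = (field(x + offset) - field(x - offset)) / (2 * eps)

print("autograd :", grad_autograd)
print("numerical:", grad_numeric)
print("max abs diff:", (grad_autograd - grad_numeric).abs().max().item())
```

Running the same comparison with a pure PyTorch hash grid in place of `tcnn.Encoding` is how I checked that autograd itself produces gradients consistent with the numerical ones.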