patrick-kidger / torchcubicspline

Interpolating natural cubic splines. Includes batching, GPU support, support for missing values, evaluating derivatives of the spline, and backpropagation.
Apache License 2.0

Backwards pass through spline.derivative gives NaN gradients #11

Closed · kkadry closed this 2 years ago

kkadry commented 2 years ago

Hi there, I'm using torchcubicspline as part of a larger optimization workflow in which I find the optimal parametric sampling points along a spline (a 1D vector of values spanning 0-1) that minimize a loss function. One part of my algorithm computes the tangent at each sampling point, and for some reason the backward pass sometimes returns NaN gradients at this point, during the spline.derivative function call.

Using torch.autograd.detect_anomaly, I found that the problem is specifically in the following line: [screenshot], where the error is: [screenshot]. If you require any other information, please let me know! The optimization code is a bit large; otherwise I would have included a minimal example that reproduces the problem. Thanks!
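For readers unfamiliar with the tool mentioned above: `torch.autograd.detect_anomaly` re-runs the backward pass with extra checks and raises a RuntimeError at the first backward function that returns NaN, pointing back at the forward-pass line that created it. A small self-contained demonstration (the computation is a deliberate NaN-producing placeholder, not the spline code):

```python
import torch

# At x = 0, the backward of sqrt computes upstream_grad / (2 * sqrt(0)),
# i.e. 0 / 0 = nan, even though the forward pass is finite.
x = torch.tensor([0.0, 1.0], requires_grad=True)

caught = None
try:
    with torch.autograd.detect_anomaly():
        y = (x * torch.sqrt(x)).sum()
        y.backward()
except RuntimeError as err:
    # e.g. "Function 'SqrtBackward0' returned nan values in its 0th output."
    caught = err

print(caught)
```

Note that detect_anomaly slows the backward pass considerably, so it is meant for debugging runs only.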

patrick-kidger commented 2 years ago

So I'm afraid that without a minimal example to reproduce the problem, it'll be difficult to pin this down.

kkadry commented 2 years ago

I'll see if I can create one in the next few weeks and open a new issue if I manage. Thank you for the response!