HIPS / autograd

Efficiently computes derivatives of NumPy code.
MIT License

Add support for numpy.interp #193

Open rainwoodman opened 7 years ago

rainwoodman commented 7 years ago

We'd like to be able to backpropagate through interp to implement a toy model. Is anybody working on this?

If not we will file a PR. (cc: @energy)

duvenaud commented 7 years ago

No one is working on it as far as I know, but a PR would be great!

rainwoodman commented 7 years ago

Is it a problem if I pull in a dependency on scipy.sparse? interp is linear in fp (a convolution-like operation with a piecewise-linear kernel), so the easiest (cleanest) implementation is to compute the interpolation weights once, store them as a sparse matrix, and do the reverse pass as a transpose-and-multiply (see the sketch below).
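
Not from the thread, just a minimal sketch of the sparse-matrix approach described above. The names (`interp_matrix`, the wrapper `interp`) and the edge handling are illustrative assumptions, not an agreed design; it assumes `xp` is sorted. Since `np.interp(x, xp, fp)` is linear in `fp`, it can be written as `y = A @ fp` with two nonzeros per row, and the vector-Jacobian product w.r.t. `fp` is then just `A.T @ g`:

```python
import numpy as np
import scipy.sparse as sp
from autograd.extend import primitive, defvjp

def interp_matrix(x, xp):
    # Sparse A such that np.interp(x, xp, fp) == A @ fp (xp assumed sorted).
    # Clipping j and w reproduces np.interp's clamping at the boundaries.
    j = np.clip(np.searchsorted(xp, x) - 1, 0, len(xp) - 2)
    w = np.clip((x - xp[j]) / (xp[j + 1] - xp[j]), 0.0, 1.0)
    rows = np.repeat(np.arange(len(x)), 2)
    cols = np.stack([j, j + 1], axis=1).ravel()
    vals = np.stack([1.0 - w, w], axis=1).ravel()
    return sp.csr_matrix((vals, (rows, cols)), shape=(len(x), len(xp)))

@primitive
def interp(x, xp, fp):
    return np.interp(x, xp, fp)

# Reverse pass w.r.t. fp only: multiply the upstream gradient by A.T.
defvjp(interp,
       lambda ans, x, xp, fp: lambda g: interp_matrix(x, xp).T @ g,
       argnums=(2,))
```

With this, `autograd.grad` of a scalar function of `interp(x, xp, fp)` works with respect to `fp`; gradients with respect to `x` (the local slope) would need an additional VJP.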

TimD1 commented 4 years ago

Any updates on this? If not, I'm searching for an efficient autograd-compatible GPU implementation of interpolation. Any idea where to look? I implemented my own in PyTorch, but training is prohibitively slow.
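
Not an answer from the thread, but for reference, a hedged sketch of a fully vectorized PyTorch version (the name `torch_interp` is made up; it assumes `xp` is 1-D and sorted, and PyTorch >= 1.6 for `torch.searchsorted`). Because it uses only batched tensor ops, it runs on GPU without a Python-level loop, which is usually what makes hand-rolled interpolation slow, and it is differentiable with respect to `fp` and `x`:

```python
import torch

def torch_interp(x, xp, fp):
    # Piecewise-linear interpolation mirroring np.interp's edge clamping:
    # queries outside [xp[0], xp[-1]] return the boundary values.
    j = torch.clamp(torch.searchsorted(xp, x) - 1, 0, xp.numel() - 2)
    w = torch.clamp((x - xp[j]) / (xp[j + 1] - xp[j]), 0.0, 1.0)
    return (1.0 - w) * fp[j] + w * fp[j + 1]
```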

NicolasBlancoV commented 3 years ago

> Any updates on this? If not, I'm searching for an efficient autograd-compatible GPU implementation of interpolation. Any idea where to look? I implemented my own in PyTorch, but training is prohibitively slow.

Have you made any progress on an autograd-compatible interpolation?