mmuckley / torchkbnufft

A high-level, easy-to-deploy non-uniform Fast Fourier Transform in PyTorch.
https://torchkbnufft.readthedocs.io/
MIT License

Forward NUFFT not backpropagatable! #22

Closed ajlok3 closed 3 years ago

ajlok3 commented 3 years ago

Hi, I know it's a bold claim, but I think there is an issue with the autograd functionality in the forward NUFFT operation. To reproduce the problem, I modified cell [8] in the Basic Example as follows:

image.requires_grad = True
kdata = nufft_ob(image, ktraj)
kdata.requires_grad

The first time I executed the cell, the output was True, as expected. But every subsequent time, it printed False. I am not an expert in torch's autograd functionality, but I would expect the output to have the same requires_grad value as the input. I also did the same test with the adjoint NUFFT (cell [10] modified):

kdata.requires_grad = True
image_blurry = adjnufft_ob(kdata, ktraj)
print(image_blurry.requires_grad)

... and the output was always True. The same holds for the examples in v0.3.4. Could it be that the nufft_ob's internal state changes with every call?
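The invariant being tested above can be sketched without torchkbnufft at all. This is a minimal stand-in, assuming torch.fft.fft2 on a uniform grid plays the role of the forward NUFFT purely for illustration; it is not torchkbnufft's code:

```python
import torch

# Stand-in operator: torch.fft.fft2 substitutes for the NUFFT here just to
# illustrate the check being made in the issue; it is NOT torchkbnufft's code.
def forward_op(image):
    return torch.fft.fft2(image)

image = torch.randn(1, 1, 8, 8, dtype=torch.complex64)
image.requires_grad = True

# The invariant the issue relies on: a differentiable op's output should
# report requires_grad=True on every call, not just the first one.
flags = []
for _ in range(3):
    kdata = forward_op(image)
    flags.append(kdata.requires_grad)

print(flags)  # expected: [True, True, True]
```

With a correctly implemented operator, the list is all True; the bug reported here is that the real forward NUFFT started returning False from the second call onward.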

Thanks a lot!

mmuckley commented 3 years ago

Hello @ajlok3, this should be fixed by PR #23. This was an incredibly weird bug caused by applying unsqueeze to the scaling coefficients. The unsqueeze actually isn't necessary, and the bug seems to have been fixed by removing it.
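The issue thread doesn't show the exact torchkbnufft code, so the following only illustrates why an unsqueeze applied to stored coefficients can make behavior call-order dependent: the out-of-place variant leaves the original tensor alone, while the in-place variant permanently changes its shape.

```python
import torch

# Contrived illustration, not torchkbnufft's actual code.
coef = torch.ones(3)

view = coef.unsqueeze(0)               # out-of-place: returns a (1, 3) view
assert coef.shape == torch.Size([3])   # original tensor is untouched

coef.unsqueeze_(0)                     # in-place: coef itself is now (1, 3)
assert coef.shape == torch.Size([1, 3])

# A buffer mutated like this inside forward() would have a different shape
# on every call, so the first call would not behave like later calls.
```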

I've just released an update - I think you can get it by pip install --upgrade torchkbnufft. Let me know if you still have the issue.
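After upgrading, a quick way to confirm gradients actually flow end to end is to run a full backward pass. This is a sketch with torch.fft.fft2 again standing in for the NUFFT forward operator (torchkbnufft itself is not imported here):

```python
import torch

# Stand-in for the NUFFT forward operator, used only to show the check.
image = torch.randn(1, 1, 8, 8, dtype=torch.complex64, requires_grad=True)

kdata = torch.fft.fft2(image)
loss = kdata.abs().sum()
loss.backward()

assert kdata.requires_grad       # the autograd graph was built
assert image.grad is not None    # gradients reached the input
print(image.grad.shape)          # torch.Size([1, 1, 8, 8])
```

The same pattern applied to nufft_ob(image, ktraj) should leave image.grad populated once the fix is in place.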

ajlok3 commented 3 years ago

Hi @mmuckley, thank you so much! It works now!