aliutkus / torchinterp1d

1D interpolation for pytorch
BSD 3-Clause "New" or "Revised" License

RuntimeError: No grad accumulator for a saved leaf! #16

Open diyiiyiii opened 1 year ago

diyiiyiii commented 1 year ago

Thanks for your excellent work! I'm hitting an error in the backward pass. Does this project support backpropagation?
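For context, backpropagation through 1-D linear interpolation works in principle with standard differentiable ops. The sketch below is a hypothetical stand-in (`linear_interp1d` is not the library's implementation) showing gradients flowing back to the interpolated values:

```python
import torch

def linear_interp1d(x, y, xnew):
    # Minimal differentiable linear interpolation (a sketch, not
    # torchinterp1d's code). x must be 1-D and sorted ascending.
    idx = torch.searchsorted(x, xnew).clamp(1, x.numel() - 1)
    x0, x1 = x[idx - 1], x[idx]
    y0, y1 = y[idx - 1], y[idx]
    t = (xnew - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

x = torch.linspace(0.0, 1.0, 11, dtype=torch.double)
y = torch.sin(x).detach().requires_grad_(True)   # leaf with grad
xnew = torch.rand(5, dtype=torch.double)

out = linear_interp1d(x, y, xnew)
out.sum().backward()                             # gradients reach y
```

Since every op here (`searchsorted` aside, which only produces indices) is differentiable, no custom `autograd.Function` is needed for the gradient to propagate.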

aliutkus commented 1 year ago

Hi! Normally it should. Have you pinned the problem down? Is there an error message?

pwangcs commented 1 year ago

> Hi! Normally it should. Have you pinned the problem down? Is there an error message?

Hi @aliutkus, I got the same error when using interp1d to train a network. The error message is as follows:

```
Traceback (most recent call last):
  File "/home/wangping/Codes/DeepOpticsSCI/E2E_train.py", line 296, in <module>
    main(E2Enet, optimizer, args)
  File "/home/wangping/Codes/DeepOpticsSCI/E2E_train.py", line 260, in main
    train(epoch, result_path, model, optimizer, logger, args)
  File "/home/wangping/Codes/DeepOpticsSCI/E2E_train.py", line 131, in train
    Loss_all.backward()
  File "/home/wangping/anaconda3/envs/sci_pytorch/lib/python3.9/site-packages/torch/tensor.py", line 245, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "/home/wangping/anaconda3/envs/sci_pytorch/lib/python3.9/site-packages/torch/autograd/__init__.py", line 145, in backward
    Variable._execution_engine.run_backward(
  File "/home/wangping/anaconda3/envs/sci_pytorch/lib/python3.9/site-packages/torch/autograd/function.py", line 89, in apply
    return self._forward_cls.backward(self, *args)  # type: ignore
  File "/home/wangping/Codes/DeepOpticsSCI/utils.py", line 159, in backward
    inputs = ctx.saved_tensors[1:]
RuntimeError: No grad accumulator for a saved leaf!
```

In my utils.py, the code related to interp1d is:

```python
def crf_3d(x):
    # x : a torch.Tensor with size [batch, H, W]
    batch, H, W = x.size()
    x = x.view(batch * H, W)
    E = torch.linspace(0.0, 1.0, steps=1024, requires_grad=False).to(x.device)
    f0 = parse_emor()  # f0 is a numpy.array loaded from the local .txt file
    I = torch.from_numpy(f0).to(x.device)
    y = interp1d(E, I, x)
    y = y.view(batch, H, W)
    return y
```

My code runs under PyTorch 1.8.1 and Python 3.9.4. Could you please let me know what's wrong? Thanks in advance.
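One way to sidestep the custom `autograd.Function` (and hence any saved-leaf issue in its backward) is to express the lookup with built-in differentiable ops. The sketch below is a hypothetical rewrite of `crf_3d`, not a confirmed fix: it assumes `parse_emor()` returns a 1-D response curve of 1024 samples (stubbed here with a gamma curve) and that `x` holds values in [0, 1]:

```python
import torch

def crf_3d_builtin(x, f0):
    # Hypothetical rewrite of crf_3d using searchsorted + lerp, so no
    # custom autograd.Function (and no ctx.saved_tensors) is involved.
    # x: [batch, H, W] with values in [0, 1]; f0: 1-D curve, e.g. 1024 samples.
    batch, H, W = x.size()
    E = torch.linspace(0.0, 1.0, steps=f0.numel(), device=x.device)
    I = f0.to(x.device)
    xf = x.reshape(-1)
    idx = torch.searchsorted(E, xf).clamp(1, E.numel() - 1)
    t = (xf - E[idx - 1]) / (E[idx] - E[idx - 1])
    y = torch.lerp(I[idx - 1], I[idx], t)   # differentiable w.r.t. x and f0
    return y.view(batch, H, W)

x = torch.rand(2, 4, 4, requires_grad=True)
f0 = torch.linspace(0.0, 1.0, 1024) ** 2.2  # stand-in for parse_emor()
y = crf_3d_builtin(x, f0)
y.sum().backward()                          # gradients reach x
```

Because `searchsorted` only produces integer indices, gradients flow through `t` and `lerp` alone, which is enough for the camera-response lookup to stay differentiable in `x`.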