sniklaus / softmax-splatting

an implementation of softmax splatting for differentiable forward warping using PyTorch

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation #50

Closed hityzy1122 closed 2 years ago

hityzy1122 commented 2 years ago

Hi, thanks for your code! I got the error in the title when calling `loss.backward()`. I fixed it by changing `self.save_for_backward(tenIn, tenFlow)` to `self.save_for_backward(tenIn.clone(), tenFlow.clone())`, but I don't know if that is the right fix?
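For context, this error typically appears when a tensor saved via `ctx.save_for_backward` is later modified in place before `backward()` runs; autograd tracks a version counter on saved tensors and refuses to use a stale one. A minimal sketch reproducing the same RuntimeError with a toy `autograd.Function` (the `Scale` class and tensor names here are hypothetical, not the softmax-splatting kernel):

```python
import torch

class Scale(torch.autograd.Function):
    """Toy Function that saves its input for the backward pass."""

    @staticmethod
    def forward(ctx, tenIn):
        ctx.save_for_backward(tenIn)
        return tenIn * 2.0

    @staticmethod
    def backward(ctx, tenGrad):
        # Unpacking saved_tensors checks the version counter and raises
        # the "modified by an inplace operation" RuntimeError if stale.
        tenIn, = ctx.saved_tensors
        return tenGrad * (tenIn * 0.0 + 2.0)

tenLeaf = torch.ones(3, requires_grad=True)
tenMid = tenLeaf * 1.0         # non-leaf tensor that the Function saves
tenOut = Scale.apply(tenMid)
tenMid += 1.0                  # in-place edit bumps tenMid's version counter

try:
    tenOut.sum().backward()
    raised = False
except RuntimeError as err:
    raised = True
    print(err)                 # "... modified by an inplace operation ..."

print('raised:', raised)
```

Saving `tenIn.clone()` instead, as in the workaround above, sidesteps the version check because the clone is a separate tensor that the later in-place op does not touch; the usual alternative is to find and remove the in-place operation (e.g. `+=`, `relu_`) in the surrounding layers, which is presumably why the maintainer asks about the neighboring layers below.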

sniklaus commented 2 years ago

What is the layer before and the layer after the softmax splatting in your network?

sniklaus commented 2 years ago

Closing due to inactivity, please feel free to reopen if this issue still persists. Thanks!

hityzy1122 commented 2 years ago

Automatic reply: your email has been received.