Closed hityzy1122 closed 2 years ago
Hi, thanks for your code! I got the error in the title when calling loss.backward(). I fixed it by changing self.save_for_backward(tenIn, tenFlow) to self.save_for_backward(tenIn.clone(), tenFlow.clone()), but I'm not sure whether this is the right fix?
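For context, a minimal sketch of the pattern being discussed: a custom torch.autograd.Function that clones the tensors it passes to save_for_backward, so that an in-place edit elsewhere in the graph cannot invalidate them. The Splat class and the elementwise forward here are hypothetical simplifications, not the actual softmax-splatting CUDA op from this repository.

```python
import torch


class Splat(torch.autograd.Function):
    """Toy stand-in for a custom op that saves its inputs for backward."""

    @staticmethod
    def forward(ctx, tenIn, tenFlow):
        # Cloning decouples the saved tensors from later in-place
        # modifications by surrounding layers; autograd's version check
        # on the saved tensors can then no longer be tripped.
        ctx.save_for_backward(tenIn.clone(), tenFlow.clone())
        return tenIn * tenFlow  # placeholder computation, not real splatting

    @staticmethod
    def backward(ctx, tenOutgrad):
        tenIn, tenFlow = ctx.saved_tensors
        # Gradients of the placeholder product w.r.t. each input.
        return tenOutgrad * tenFlow, tenOutgrad * tenIn


tenIn = torch.randn(2, 3, requires_grad=True)
tenFlow = torch.randn(2, 3, requires_grad=True)
out = Splat.apply(tenIn, tenFlow)
out.sum().backward()
```

The trade-off is extra memory for the copies; cloning is a workaround for whichever adjacent layer mutates the tensor in place, which is why the question below about the surrounding layers matters.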
What is the layer before and the layer after the softmax splatting in your network?
Closing due to inactivity, please feel free to reopen if this issue still persists. Thanks!