Closed abhi1kumar closed 4 years ago
Hey Abhinav,
This is because you are re-using the variable name `conf` (`conf = sort_tensor(...)`). I tried it, and it works if you give the sorted output a different name.
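For context, here is a minimal sketch of why reusing the name hides the gradient. It uses `torch.sort` as a stand-in for this repository's `soft_sort` module, and the variable names are illustrative, not taken from the original snippets: `.grad` is only populated on leaf tensors, so once the name `conf` is rebound to the sorted (non-leaf) result, the handle to the leaf tensor that actually receives the gradient is lost.

```python
import torch

# Buggy pattern: reuse the name `conf` for the sorted result.
conf = torch.randn(5, requires_grad=True)   # leaf tensor
conf = torch.sort(conf).values              # `conf` now points at a non-leaf tensor
conf.sum().backward()
buggy_grad = conf.grad                      # None: non-leaf tensors do not keep .grad

# Fix: bind the sorted output to a new name so the leaf stays reachable.
conf2 = torch.randn(5, requires_grad=True)  # leaf tensor
sorted_conf = torch.sort(conf2).values      # sorted output gets its own name
sorted_conf.sum().backward()
fixed_grad = conf2.grad                     # populated (all ones for a plain sum)
```

The gradient was flowing in both cases; the difference is only which tensor the name `conf` refers to when you inspect `.grad` afterwards.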
@abhi1kumar If you can confirm that this solved your problem, we'll close the issue.
Thank you. Yes, it solves my issue. You can close this thread.
Hi Authors, thank you for releasing your code. I tried checking the numerical gradients in PyTorch. With the `soft_sort` module, I do not obtain the gradients after calling `loss.backward()`. However, when I do not use the `soft_sort` module, I am able to obtain the gradients. Below are the code snippets for the two situations.

With the `soft_sort` module, the outputs are:

Without `soft_sort`, the outputs are: