Hi @ximi1211!
Thank you for letting me know about this issue. Over the last few hours I analyzed the cause and found that a single line I rewrote while cleaning up the code before the public release introduced the problem. The equation is mathematically equivalent, but I still don't know why NaN appears during back-propagation (I replace the loss with 0 whenever its value is NaN). Please replace line 59 of model_groupwrapper.py with the PyTorch built-in function as follows:
```python
# sig = 1 / (1 + ((dist_mat - self.th) / tau).exp())
sig = (-(dist_mat - self.th) / tau).sigmoid()
```
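For context, here is a minimal standalone sketch of why the two forms behave differently under autograd. The values are made up (a scalar tensor stands in for the real `dist_mat`, with `th = 0` and `tau = 1`); it only illustrates that the hand-written `1 / (1 + exp(x))` can overflow and yield a NaN gradient, while `torch.sigmoid` stays finite because its backward uses `y * (1 - y)`:

```python
import torch

# Toy stand-in for the real distance matrix; large value to force exp() overflow
dist_mat = torch.tensor([200.0], requires_grad=True)
th, tau = 0.0, 1.0

# Naive form: exp(200) overflows to inf in float32, and the backward pass
# ends up computing 0 * inf, which gives a NaN gradient.
naive = 1 / (1 + ((dist_mat - th) / tau).exp())
naive.backward()
print(dist_mat.grad)  # tensor([nan])

dist_mat.grad = None

# Built-in sigmoid: same forward value, but the gradient stays finite.
stable = (-(dist_mat - th) / tau).sigmoid()
stable.backward()
print(dist_mat.grad)  # finite (here effectively 0)
```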
I'm closing this issue for now. Please feel free to open another issue for other questions.
Hello, may I ask why my loss has been 0 ever since the first epoch?
![image](https://user-images.githubusercontent.com/80146128/187010679-f4e889ef-2228-44ca-a3e1-554af86766b4.png)