kaiwang960112 / Self-Cure-Network

This is a novel and simple method for handling annotation uncertainty.

Question about the self-attention module #77

Closed hanluyt closed 6 months ago

hanluyt commented 1 year ago

Hello, in the paper you use a self-attention module to learn a weight for each image. I followed your code: at the start of training the per-image weights differ, but after one epoch of iterations every image's weight converges to about 1. Have you run into this before, or is there a way to fix it? Thanks!

kaiwang960112 commented 1 year ago

You need to add a loss so that the weights stay differentiated; the paper should mention this point.
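The loss being referred to is presumably the rank-regularization (RR) loss described in the SCN paper: the per-image attention weights in a mini-batch are sorted, split into a high- and a low-importance group, and a hinge penalty fires unless the two group means are separated by a margin, which prevents all weights from collapsing to 1. Below is a minimal plain-Python sketch of that idea; it is an illustrative re-implementation, not the authors' code, and the default values for the group ratio `beta` and margin `delta_1` are assumptions.

```python
def rank_regularization_loss(weights, beta=0.7, delta_1=0.15):
    """Margin loss that keeps per-image attention weights spread apart.

    weights: attention weights for one mini-batch (plain floats here)
    beta:    fraction of samples placed in the high-importance group
    delta_1: required margin between the two group means
    (beta and delta_1 defaults are illustrative, not the paper's.)
    """
    ordered = sorted(weights, reverse=True)
    split = max(1, int(len(ordered) * beta))
    high, low = ordered[:split], ordered[split:]
    if not low:  # degenerate batch: nothing to contrast against
        return 0.0
    mean_high = sum(high) / len(high)
    mean_low = sum(low) / len(low)
    # hinge: zero once the groups are separated by at least delta_1
    return max(0.0, delta_1 - (mean_high - mean_low))

# If every weight has collapsed to ~1 (the symptom in this issue),
# the loss is positive and pushes the groups apart; well-separated
# weights give zero loss.
print(rank_regularization_loss([1.0, 1.0, 1.0, 1.0]))  # 0.15
print(rank_regularization_loss([0.9, 0.8, 0.7, 0.3]))  # 0.0
```

In training this term would be added to the classification loss, so the gradient only flows when the weight distribution becomes too flat.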


hanluyt commented 1 year ago

OK, thank you for the reply.

Yiqin-Luo commented 1 year ago

Hello! Do you still have the complete SCN code saved? Could you share it? Thanks a lot!