kaiwang960112 / Self-Cure-Network

This is a novel and easy method for suppressing annotation uncertainties.

attention module outputs always ~=0 #37

Closed zhanglaplace closed 6 months ago

zhanglaplace commented 3 years ago

Hello, I'm training a model on AffectNet here, using resnet50 as the backbone. The attention module's outputs are almost all 0, so the diff between high_group and low_group is also basically 0.
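To illustrate why near-zero attention outputs make the high/low group diff collapse, here is a minimal sketch of the rank-regularization step as described in the Self-Cure Network paper. The function name, the 0.7 split ratio, and the 0.15 margin are illustrative assumptions, not this repository's exact code:

```python
# Sketch of SCN-style rank regularization (names and hyperparameters are
# illustrative, based on the SCN paper, not this repo's implementation).

def rank_regularization_loss(attention_weights, high_ratio=0.7, margin=0.15):
    """Sort per-sample attention weights, split them into a high and a low
    group, and penalize the case where the high-group mean does not exceed
    the low-group mean by at least `margin`.

    Returns (diff, loss) where diff = mean(high) - mean(low) and
    loss = max(0, margin - diff).
    """
    weights = sorted(attention_weights, reverse=True)
    split = int(len(weights) * high_ratio)  # assumes 0 < split < len(weights)
    high_group = weights[:split]
    low_group = weights[split:]
    mean_high = sum(high_group) / len(high_group)
    mean_low = sum(low_group) / len(low_group)
    diff = mean_high - mean_low
    return diff, max(0.0, margin - diff)

# Symptom reported in this issue: if the attention module outputs are all
# ~0, both group means coincide, diff is 0, and the loss saturates at the
# margin for every batch.
diff, loss = rank_regularization_loss([0.0] * 8)
```

With collapsed attention outputs, `diff == 0` and `loss == margin` on every batch, so the rank-regularization term provides no ranking signal between high- and low-importance samples.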