Open RedHeadM opened 2 years ago
In mmseg.distillation.losses.cwd.ChannelWiseDivergence, the student softmax output is computed but never used (the same goes for self.name):

softmax_pred_S = F.softmax(preds_S.view(-1, W * H) / self.tau, dim=1)

Is this intended, or a bug?
Thanks for your attention!
The arg is deprecated.
If you want to distill models in OpenMMLab-related repos, you could join the WeChat group linked in the README.md.
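For reference, here is a minimal sketch of why that line is dead code: in the usual channel-wise distillation (CWD) formulation, the KL term calls log_softmax on the student logits directly, so the separately computed student softmax is never read. Variable names and the tau/weight/normalization handling follow the snippet quoted above; the rest is an assumption, not the repo's exact code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelWiseDivergence(nn.Module):
    """Channel-wise KD: per-channel KL between spatial softmax maps (sketch)."""

    def __init__(self, tau=1.0, loss_weight=1.0):
        super().__init__()
        self.tau = tau
        self.loss_weight = loss_weight

    def forward(self, preds_S, preds_T):
        N, C, H, W = preds_S.shape
        # Teacher probabilities over spatial positions, one row per (N*C) channel.
        softmax_pred_T = F.softmax(preds_T.view(-1, W * H) / self.tau, dim=1)

        # The line from the issue: computed but never read, because the KL
        # term below applies log_softmax to the student logits directly.
        # softmax_pred_S = F.softmax(preds_S.view(-1, W * H) / self.tau, dim=1)

        # KL(T || S) = sum p_T * (log p_T - log p_S), scaled by tau^2.
        loss = torch.sum(
            softmax_pred_T * F.log_softmax(preds_T.view(-1, W * H) / self.tau, dim=1)
            - softmax_pred_T * F.log_softmax(preds_S.view(-1, W * H) / self.tau, dim=1)
        ) * (self.tau ** 2)
        return self.loss_weight * loss / (C * N)
```

Folding the student softmax into log_softmax inside the KL term is numerically safer than taking log(softmax_pred_S) explicitly, which is presumably why the standalone softmax_pred_S line was left behind as a leftover rather than removed.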