pppppM / mmsegmentation-distiller

This is a knowledge distillation toolbox based on mmsegmentation.
Apache License 2.0

ChannelWiseDivergence unused variable #9

Open · RedHeadM opened this issue 2 years ago

RedHeadM commented 2 years ago

In `mmseg.distillation.losses.cwd.ChannelWiseDivergence`, the student softmax output is never used (neither is `self.name`): `softmax_pred_S = F.softmax(preds_S.view(-1,W*H)/self.tau, dim=1)`. Is this intended, or is it a bug?
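For context, here is a minimal sketch of how a channel-wise divergence loss is typically computed; the class layout, the `loss_weight` argument, and the normalization are assumptions, not the repo's exact code. It illustrates why a separately computed student softmax ends up dead: the KL term KL(T‖S) only needs the student's *log*-softmax.

```python
import torch
import torch.nn.functional as F


class ChannelWiseDivergence(torch.nn.Module):
    """Per-channel KL divergence over a spatial softmax (sketch).

    Assumed signature and defaults; not the repo's exact implementation.
    """

    def __init__(self, tau=1.0, loss_weight=1.0):
        super().__init__()
        self.tau = tau
        self.loss_weight = loss_weight

    def forward(self, preds_S, preds_T):
        N, C, H, W = preds_S.shape
        # Flatten each channel into a distribution over the H*W locations.
        logits_S = preds_S.view(-1, H * W) / self.tau
        logits_T = preds_T.view(-1, H * W) / self.tau

        softmax_T = F.softmax(logits_T, dim=1)
        # KL(T || S) = sum softmax(T) * (log_softmax(T) - log_softmax(S)).
        # Only the student's log-softmax appears, so a plain softmax of the
        # student (like softmax_pred_S in the issue) is never needed here.
        loss = torch.sum(
            softmax_T
            * (F.log_softmax(logits_T, dim=1) - F.log_softmax(logits_S, dim=1))
        ) * (self.tau ** 2)
        return self.loss_weight * loss / (N * C)
```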

pppppM commented 2 years ago

Thanks for your attention!

The arg is deprecated.

If you want to distill models in OpenMMLab-related repos, you could join the WeChat group linked in the README.md.