HobbitLong / RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
BSD 2-Clause "Simplified" License

About deep mutual learning setting #34

Open swlzq opened 3 years ago

swlzq commented 3 years ago

Thank you for your excellent code! I'm interested in the "deep mutual learning" setting. Could you share some training details?
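For context, deep mutual learning (DML, Zhang et al., 2018) trains two peer networks simultaneously: each minimizes a cross-entropy loss on the labels plus a KL divergence pulling its softmax output toward the peer's. The sketch below is a minimal, illustrative per-sample loss computation in plain Python, not the actual implementation in this repo; the function names (`softmax`, `kl_div`, `dml_losses`) are hypothetical.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    # KL(p || q) for two discrete distributions given as lists.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dml_losses(logits_a, logits_b, label):
    # Illustrative DML objective: each network's loss is its
    # cross-entropy on the true label plus the KL divergence from
    # the peer's predictive distribution to its own.
    pa, pb = softmax(logits_a), softmax(logits_b)
    ce_a = -math.log(pa[label])
    ce_b = -math.log(pb[label])
    loss_a = ce_a + kl_div(pb, pa)  # network A mimics network B
    loss_b = ce_b + kl_div(pa, pb)  # network B mimics network A
    return loss_a, loss_b
```

In a real training loop both networks are updated in the same iteration, each using its own loss, so the peers regularize each other rather than having a fixed teacher.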