HobbitLong / RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
BSD 2-Clause "Simplified" License

how to train my model? #35

Open 972461099 opened 3 years ago

972461099 commented 3 years ago

Hi, thanks for your wonderful work. I want to train my own model with distillation in order to reduce its size. How should I modify my model (and the code) to do this?
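For reference, the repository's README describes a two-stage workflow: first train (or fetch) a teacher, then distill into a student via `train_student.py`. A minimal sketch of that workflow is below; the checkpoint path and model names are placeholders, and a custom architecture would additionally need to be registered in the repo's model definitions so the `--model_s` flag can resolve it:

```shell
# 1. Train a teacher model from scratch (or use a pretrained checkpoint).
#    The model name must be one the repo already knows about.
python train_teacher.py --model resnet32x4

# 2. Distill into a smaller student with CRD.
#    --path_t   : path to the saved teacher checkpoint (placeholder below)
#    --distill  : distillation objective (crd = Contrastive Representation Distillation)
#    --model_s  : student architecture; replace with your own model's
#                 registered name after adding it to the model definitions
#    -a / -b    : weights on the KD and CRD loss terms (values from the README example)
python train_student.py \
    --path_t ./save/models/resnet32x4_vanilla/ckpt_epoch_240.pth \
    --distill crd \
    --model_s resnet8x4 \
    -a 0 -b 0.8 \
    --trial 1
```

Since the goal here is a smaller custom model, the key step is adding the custom architecture to the repo's model registry and passing its name as `--model_s`; the teacher can remain one of the provided pretrained networks.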