HobbitLong / RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods

Training scheme for linear probe on STL-10 and TinyImageNet #44

Open 4m4n5 opened 2 years ago

4m4n5 commented 2 years ago

Hello,

Could you provide the exact training strategy and optimization details (optimizer, learning rate schedule, number of epochs) used for the transferability-of-representations experiment (Table 4 in the paper), i.e., the linear probe on STL-10 and TinyImageNet?
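
For concreteness, here is a minimal sketch of the kind of frozen-backbone linear-probe setup I have in mind; the optimizer, learning rate, and epoch count below are placeholders I made up, and `backbone` is assumed to return a flat feature vector. I'd like to know what you actually used:

```python
import torch
import torch.nn as nn

def linear_probe(backbone, feat_dim, num_classes, loader, epochs=40, lr=0.1):
    """Train a linear classifier on top of a frozen backbone.

    All hyperparameters here are illustrative placeholders, not the
    paper's actual values.
    """
    backbone.eval()
    for p in backbone.parameters():
        p.requires_grad = False  # freeze the representation

    classifier = nn.Linear(feat_dim, num_classes)
    optimizer = torch.optim.SGD(classifier.parameters(), lr=lr, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for images, labels in loader:
            with torch.no_grad():
                feats = backbone(images)  # assumed to output (batch, feat_dim)
            loss = criterion(classifier(feats), labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return classifier
```

In particular, I'm unsure about the learning rate schedule, weight decay, and whether any data augmentation was applied when extracting the frozen features.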