yuanli2333 / Teacher-free-Knowledge-Distillation

Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
MIT License

Data augmentation for Tiny-ImageNet #23

Open aryanasadianuoit opened 3 years ago

aryanasadianuoit commented 3 years ago

Hello,

How did you decide on the data augmentation transformations that you applied to Tiny-ImageNet? Did you use the setting from some other paper? Thank you in advance.
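For context, a common choice for Tiny-ImageNet (64x64 images) is the CIFAR-style pipeline of random crop with padding plus random horizontal flip. The sketch below illustrates that pipeline in plain numpy; the padding and crop sizes are assumptions for illustration, not necessarily the settings used in this repo:

```python
import numpy as np

def random_crop_flip(img, pad=4, crop=64, rng=None):
    """CIFAR-style augmentation sketch: reflection-pad, take a random
    crop, and flip horizontally with probability 0.5. The pad/crop
    values are illustrative assumptions, not this repo's setting."""
    rng = rng if rng is not None else np.random.default_rng()
    # Pad only the spatial dimensions (H, W), leaving channels intact.
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    # Sample the top-left corner of the crop window.
    y = rng.integers(0, 2 * pad + 1)
    x = rng.integers(0, 2 * pad + 1)
    out = padded[y:y + crop, x:x + crop]
    if rng.random() < 0.5:
        out = out[:, ::-1]  # horizontal flip
    return out

img = np.zeros((64, 64, 3), dtype=np.uint8)
aug = random_crop_flip(img, rng=np.random.default_rng(0))
print(aug.shape)  # (64, 64, 3)
```

In torchvision this would correspond to something like `RandomCrop(64, padding=4)` followed by `RandomHorizontalFlip()`, but whether the authors used exactly these values is what the question above asks.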