yuanli2333 / Teacher-free-Knowledge-Distillation

Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
MIT License

where's the paper? #10

Closed vraivon closed 4 years ago

vraivon commented 4 years ago

I cannot find "Revisiting Knowledge Distillation via Label Smoothing Regularization" on arXiv, only "Revisit Knowledge Distillation: a Teacher-free Framework". Is there any difference between those two papers?

yuanli2333 commented 4 years ago

Hi, it is the same paper. We changed the title when submitting to CVPR, and the arXiv title will be updated once the CVPR 2020 paper is available online.

vraivon commented 4 years ago

Thank you.