yzd-v / cls_KD

'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024)
Apache License 2.0

Knowledge Distillation for Image Classification

This repository includes the official implementations of the following papers:

- NKD and USKD: "From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels" (ICCV 2023)
- ViTKD: "Practical Guidelines for ViT Feature Knowledge Distillation" (CVPRW 2024)

It also provides unofficial implementations of several other knowledge distillation papers.

If this repository is helpful, please give us a star ⭐ and cite the relevant papers.

Install
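Since the code is based on MMPretrain, a typical setup follows the OpenMMLab toolchain. The commands below are a hedged sketch, not the repository's verified instructions: the environment name, Python version, and package versions are assumptions, so check the repository itself for the exact requirements.

```shell
# Hypothetical setup assuming the MMPretrain/OpenMMLab toolchain;
# versions and environment name are illustrative assumptions.
conda create -n cls_kd python=3.8 -y
conda activate cls_kd

# MIM is the standard installer for OpenMMLab packages such as mmcv.
pip install -U openmim
mim install mmengine "mmcv>=2.0.0"

# Clone this repository and install it in editable mode.
git clone https://github.com/yzd-v/cls_KD.git
cd cls_KD
pip install -e .
```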

Run
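MMPretrain-based projects are usually launched through `tools/train.py` or the distributed wrapper `tools/dist_train.sh`. The commands below are a sketch under that assumption; the config path `configs/nkd_example.py` is a hypothetical placeholder, so substitute an actual config file from this repository.

```shell
# Single-GPU training (hypothetical config path; replace with a real one).
python tools/train.py configs/nkd_example.py

# Multi-GPU training via the standard MMPretrain wrapper, here with 8 GPUs.
bash tools/dist_train.sh configs/nkd_example.py 8
```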

Citing NKD and USKD

```
@inproceedings{yang2023knowledge,
  title={From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels},
  author={Yang, Zhendong and Zeng, Ailing and Yuan, Chun and Li, Yu},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={17185--17194},
  year={2023}
}
```

Citing ViTKD

```
@article{yang2022vitkd,
  title={ViTKD: Practical Guidelines for ViT Feature Knowledge Distillation},
  author={Yang, Zhendong and Li, Zhe and Zeng, Ailing and Li, Zexian and Yuan, Chun and Li, Yu},
  journal={arXiv preprint arXiv:2209.02432},
  year={2022}
}
```

Acknowledgement

Our code is built on MMPretrain.