RangeKing opened this issue 1 year ago
Sorry for the delayed response, and thank you for your advice. Self-distillation is a widely used technique in which the teacher is constrained to be the student itself. It's a great idea, but we haven't been able to support it yet due to a shortage of manpower. We would appreciate it if you could open a pull request. Here is an example of using distillation on RTMDet.
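For whoever picks this up, below is a minimal sketch of the general idea in plain PyTorch; it is not MMYOLO or YOLOv6 code, and the names and numbers in it (`temperature`, `alpha`, the cross-entropy stand-in for the detection loss, the `nn.Linear` stand-in for the detector) are illustrative assumptions. The teacher is simply a frozen copy of the student's own pretrained weights, and a temperature-scaled KL term on the class predictions is added to the ordinary task loss.

```python
# Minimal self-distillation sketch (illustrative only, not MMYOLO/YOLOv6 code).
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


def self_distillation_loss(student_logits, teacher_logits, targets,
                           temperature=2.0, alpha=0.5):
    """Task loss plus a soft-label KL term against the frozen teacher.

    Cross-entropy stands in for the full detection loss so the sketch
    stays self-contained; temperature and alpha are illustrative values.
    """
    task_loss = F.cross_entropy(student_logits, targets)
    t = temperature
    # KL(student || teacher) on temperature-softened class distributions.
    distill_loss = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)
    return (1 - alpha) * task_loss + alpha * distill_loss


# The "teacher" is just a frozen copy of the (pretrained) student itself.
student = nn.Linear(16, 10)              # stand-in for the detector
teacher = copy.deepcopy(student).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

x = torch.randn(4, 16)
y = torch.randint(0, 10, (4,))
with torch.no_grad():
    teacher_logits = teacher(x)

loss = self_distillation_loss(student(x), teacher_logits, y)
loss.backward()
```

As far as I understand, YOLOv6 also distills the regression branch (via DFL) and decays the distillation weight over training; the sketch above only shows the classification term.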
Describe the feature
YOLOv6 self-distillation
Motivation
Self-distillation is a common practice to improve performance.
Related resources
https://github.com/meituan/YOLOv6