meituan / YOLOv6

YOLOv6: a single-stage object detection framework dedicated to industrial applications.
GNU General Public License v3.0

how to use self-distillation, and what's the difference with finetune teacher model with new data #920

Open HaoLiuHust opened 1 year ago

HaoLiuHust commented 1 year ago

Question

I want to train on my own dataset; right now I am training on it starting from the COCO weights. What should I do if I want to try self-distillation on this task, and what should the teacher model be: the COCO weights, or a model trained once on my own data and then used as the teacher for a second training run?


Chilicyy commented 1 year ago

Hi @HaoLiuHust, when you try self-distillation on your task, you can use the model first trained on your own data as the teacher model, and use the COCO weights to initialize the student for finetuning, which is specified in the config file.
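As a rough sketch of what that setup could look like (the field names below are illustrative assumptions, not the repo's exact schema; check the files under `configs/` in YOLOv6 for the real keys):

```python
# Illustrative YOLOv6-style Python config fragment.
# Assumption: field names approximate the repo's config schema.

model = dict(
    type='YOLOv6s',
    # COCO-pretrained weights initialize the student for finetuning,
    # as the reply above describes.
    pretrained='./weights/yolov6s_coco.pt',
)

# Hypothetical path to the teacher: the checkpoint you already trained
# once on your own dataset. In the actual repo the teacher checkpoint is
# supplied to the training script; the exact argument name may differ.
teacher_model_path = './runs/train/exp/weights/best_ckpt.pt'
```

The key point is that the two checkpoints play different roles: the COCO weights are only a starting point for the student, while the teacher is the already-converged model on your own data.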

HaoLiuHust commented 1 year ago

> Hi @HaoLiuHust, when you try self-distillation on your task, you can use the model first trained on your own data as the teacher model, and use the COCO weights to initialize the student for finetuning, which is specified in the config file.

So that means using the first trained model as the teacher? Why would this model help, since the teacher and the student are trained from the same dataset and the same initial weights?
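For context on why a same-architecture, same-data teacher can still help: the teacher supplies temperature-softened class probabilities ("soft targets") that encode inter-class similarity, which hard one-hot labels do not. A minimal sketch of the standard knowledge-distillation loss in plain Python (this is illustrative only, not YOLOv6's actual implementation, which distills detection outputs):

```python
import math

def softmax(logits, t=1.0):
    # Temperature-scaled softmax: higher t produces softer distributions.
    exps = [math.exp(x / t) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, t=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by t*t as is conventional in distillation.
    p = softmax(teacher_logits, t)
    q = softmax(student_logits, t)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * t * t

# When the student already matches the teacher, the loss is zero;
# any disagreement in the soft distributions yields a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(kd_loss([0.5, 1.0, 0.1], [2.0, 1.0, 0.1]))
```

Even with identical data and init, the second run optimizes against these richer soft targets in addition to the hard labels, which is the usual argument for self-distillation.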