HaoLiuHust opened this issue 1 year ago
Hi @HaoLiuHust, when you try self-distillation on your task, you can use the first model trained on your own data as the teacher model, and use the COCO weights for finetuning, as specified in the config file.
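To illustrate the idea behind that workflow, here is a minimal, framework-free sketch of the standard knowledge-distillation loss (Hinton et al., 2015), where the frozen teacher is the first model trained on your data and the student restarts from the COCO weights. The logit values below are made up for illustration; a real detector would distill over per-anchor class and box predictions, not a single logit vector.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T gives a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # teacher soft targets (no gradient)
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl

# Hypothetical logits for one prediction over three classes.
teacher_logits = [2.0, 0.5, 0.1]   # first model trained on your data
student_logits = [1.0, 0.8, 0.3]   # model being finetuned from COCO weights

# In training, this term is added to the usual hard-label detection loss.
soft_loss = distillation_loss(student_logits, teacher_logits, T=2.0)
print(round(soft_loss, 4))
```

Even though teacher and student see the same dataset, the teacher's soft targets carry inter-class similarity information that the one-hot ground-truth labels do not, which is why the extra term can still help.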
Does that mean using the first trained model as the teacher? Why would that model help, since the teacher and student are trained on the same dataset and from the same initial weights?
Before Asking
[X] I have read the README carefully.
[X] I want to train my custom dataset, and I have read the tutorials for training custom data carefully and organized my dataset correctly. (FYI: We recommend applying the config files of xx_finetune.py.)
[X] I have pulled the latest code of the main branch and run it again, and the problem still exists.
Search before asking
Question
I want to train my own dataset; for now, I am training on it starting from the COCO weights. What should I do if I want to try self-distillation on this task? What is the teacher model: the COCO weights, or should I train once on my own data and then use that model as the teacher and train again?
Additional
No response