-
### Search before asking
- [X] I have searched the Ultralytics [issues](https://github.com/ultralytics/ultralytics/issues) and found no similar feature requests.
### Description
Hi everyone…
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussion…
-
Hi.
Did you publish the code for the knowledge distillation loss? I couldn't find it in the repository.
If it isn't there, could you please publish it?
Thanks
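
For context, the standard distillation loss from Hinton et al. ("Distilling the Knowledge in a Neural Network") compares temperature-softened teacher and student distributions with a KL divergence scaled by T². The sketch below is a generic reference implementation in NumPy, not necessarily what Ultralytics uses:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients stay comparable across temperatures (Hinton et al.)."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # soft predictions from the student
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

In practice this term is usually mixed with the ordinary task loss on ground-truth labels, e.g. `loss = alpha * kd_loss(...) + (1 - alpha) * task_loss` (the weight `alpha` here is illustrative).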
-
I've noticed that this knowledge distillation is somewhat similar to what is mentioned in EAGLE, and it has proven to be very effective. I would like to know if you have tried the knowledge distillati…
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussion…
-
Current knowledge distillation recipes don't support activation offloading or opt_in_bwd.
The implementation should be similar to the one in other recipes, like full_finetuning_distribute…
-
### Method description
Hi team, I have implemented some distillation-based trainers and would like to contribute them to `trl`. Do you accept contributions on this, or is this something already…
-
Knowledge distillation: A good teacher is patient and consistent
tensorflow:
https://github.com/google-research/big_vision/tree/main/big_vision/configs/proj/distill
Do you have plans to open …
-
May I ask if there is any knowledge distillation related work in this project? If not, do you have suggestions?
-
Hello, is there Knowledge Distillation code available, and what works best for detection?