-
Knowledge distillation: A good teacher is patient and consistent
tensorflow:
https://github.com/google-research/big_vision/tree/main/big_vision/configs/proj/distill
Do you have plans to open …
-
May I ask whether there is any knowledge-distillation-related work in this project? If not, do you have any suggestions?
-
Hello, is knowledge distillation implemented in this code, and which approach works best for detection?
-
Hello, I would like to ask where this code applies a distillation algorithm during training. The teacher and student networks appear to be identical, so there is no distillation, r…
-
Hi!
I came across this library very recently and I am loving it! In my current research I am trying to implement knowledge distillation, which requires multiple datasets to be passed in; here a singl…
-
Hello, Author,
Your paper is very nice; thanks for sharing. My confusion is that I can't find which part of the code implements the knowledge distillation.
-
Your project is excellent. If I want to use knowledge distillation to teach your ViT-Adapter-S model the human semantic segmentation performance of ViT-Adapter-L, what should I be mindful of?
-
["Distilling the Knowledge in a Neural Network"](https://link.zhihu.com/?target=https%3A//arxiv.org/abs/1503.02531)
[Prakhar Ganesh. "Knowledge Distillation : Simplified"](https://towardsdatascience…
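Since several of the questions above ask where the distillation objective actually lives in code, here is a minimal, framework-agnostic sketch of the classic loss from "Distilling the Knowledge in a Neural Network" (Hinton et al., 2015): the KL divergence between temperature-softened teacher and student distributions, scaled by T². This is a NumPy illustration of the general technique, not any specific repository's implementation; the function names and the temperature T=4.0 are illustrative choices.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    as in Hinton et al. (2015). Returns a scalar averaged over the batch."""
    p = softmax(teacher_logits, T)           # soft teacher targets
    log_q = np.log(softmax(student_logits, T))
    return float((T ** 2) * (p * (np.log(p) - log_q)).sum(axis=-1).mean())

# Identical logits give zero loss; mismatched logits give a positive loss.
t = np.array([[2.0, 0.5, -1.0]])
print(round(distillation_loss(t, t), 6))                 # 0.0
print(distillation_loss(np.zeros((1, 3)), t) > 0)        # True
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a hyperparameter (often called alpha); the teacher's logits are computed with gradients disabled.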
-
"[Object detection at 200 Frames Per Second](https://arxiv.org/pdf/1805.06361.pdf)": in this paper, you can see a significant improvement in the performance of tiny-yolov2.
Is there a way to use th…
-
First of all, very interesting and relevant work!
In your [review rebuttal](https://openreview.net/forum?id=OFMPrCAMKi&noteId=wjbkmo4Wrp) you stated that you were planning on adding Mask2Former to …