-
Hey, I am very interested in this work and have some questions to ask.
I used 20 images per class for MNIST dataset distillation by running
`python main.py --mode distill_basic --dataset MNIST --arch…
-
### Prerequisite
- [X] I have searched [Issues](https://github.com/open-mmlab/mmdetection/issues) and [Discussions](https://github.com/open-mmlab/mmdetection/discussions) but cannot get the expected …
-
Thanks for this amazing work. The current command-line example shows how to create a model using the KD process and a single dataset, i.e., CIFAR10. However, I am trying to create a student model (usi…
-
target task: summarization
distillation: teacher → student (draft model)
t5-xl: target, t5-small: drafter
n-gram: ...?
n-gram: KD
The n-gram model should be trained on the model-generated dataset.
…
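The notes above describe a standard teacher → student distillation setup (a large target model producing soft targets for a small drafter). As a minimal sketch of the usual temperature-softened KD objective (Hinton-style KL between teacher and student distributions) — `softmax` and `kd_loss` are illustrative helpers, not functions from any of the repos discussed here:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax, stabilized by subtracting the row max.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in the original KD formulation.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

The `T * T` factor keeps gradient magnitudes comparable across temperatures; the loss is zero when the student exactly matches the teacher's logits and positive otherwise.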
-
Hi, I'm curious about how Nomos-v2 was collected. Also, is it better than other datasets? Is there a performance benchmark?
-
The source code on GitHub appears to be incomplete, and the dataset is missing. Could you release the full source code and dataset to make reproduction easier? I look forward to the auth…
-
Hello! I am new to dataset distillation, so my question may be shallow. It seems to me that dataset distillation is generally used for classification tasks, synthesizing condensed data for each class for e…
-
Hello~ Could you please release the dataset used for distillation? It would be very kind of you to release it~
-
Hi!
I came across this library very recently and I am loving it! In my current research I am trying to implement knowledge distillation, which requires multiple datasets to be passed in; here a singl…
-
Hello, I would like to ask: where in this code is the distillation algorithm applied during training? I see that the teacher and student networks are the same, so there is no actual distillation, r…