-
Hi, congratulations on the release of V2. I found the accompanying paper quite interesting.
I have a question regarding the performance differences between two approaches in metric depth estimation …
-
### Search before asking
- [X] I have searched the YOLOv5 [issues](https://github.com/ultralytics/yolov5/issues) and [discussions](https://github.com/ultralytics/yolov5/discussions) and found no simi…
-
How can I use different language models from Hugging Face for knowledge distillation in this setup?
-
Thanks for your work!
I want to know how to train with distillation: what is the teacher model, how do I obtain it, and what are the inputs to the teacher and the student? Also, what losses are used for dis…
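Since the repository's actual loss is not shown here, a minimal sketch of the classic Hinton-style distillation objective (temperature-softened KL term plus hard-label cross-entropy) may help frame the question. All function names below are illustrative, not from the repository:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, true_label, temperature=2.0, alpha=0.5):
    """Hinton-style knowledge distillation loss:
    alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(student, label).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    # KL divergence between the softened teacher and student distributions.
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s) if pt > 0)
    # Hard-label cross-entropy on the unsoftened student distribution.
    ce = -math.log(softmax(student_logits)[true_label])
    return alpha * (temperature ** 2) * kl + (1 - alpha) * ce
```

In this formulation the teacher is run in inference mode to produce logits, and only the student receives gradient updates.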
-
-
Hi, I'd like to have a few questions on the workflow combining [llm-recipes](https://github.com/Nicolas-BZRD/llm-recipes) with [llm-distillation](https://github.com/Nicolas-BZRD/llm-distillation) to c…
-
Hello, can this distillation model be used for time-series models? The dataset I want to process is related to weather prediction — can this be used for that?
-
Hello Author,
I would like to ask whether the workflow is as follows: first, train.py produces the teacher model weights; then preparelabel.py generates the teacher model's image labels for the student model; and then distillatio…
-
Hello, I saw that distillation can be used when fine-tuning the bge-m3 model. How do I create a distillation dataset?
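A distillation dataset for embedding-model fine-tuning is typically a file of (query, passage) pairs annotated with a stronger teacher's relevance scores. A minimal sketch of building such a file is below; `teacher_score` is a hypothetical callable standing in for a real cross-encoder or larger embedding model, and the JSONL field names are illustrative, not a format required by bge-m3:

```python
import json

def build_distillation_dataset(pairs, teacher_score, out_path):
    """Write one JSON line per (query, passage) pair, attaching the
    teacher's relevance score so a student can be trained to match it.

    pairs:         iterable of (query, passage) string tuples
    teacher_score: hypothetical callable (query, passage) -> float
    out_path:      destination .jsonl file
    """
    with open(out_path, "w", encoding="utf-8") as f:
        for query, passage in pairs:
            record = {
                "query": query,
                "passage": passage,
                "teacher_score": teacher_score(query, passage),
            }
            f.write(json.dumps(record, ensure_ascii=False) + "\n")
```

The student is then fine-tuned to regress (or rank against) the stored teacher scores instead of binary relevance labels.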
-
### Model/Dataset/Scheduler description
Classifier-free guided diffusion models have recently been shown to be highly effective at high-resolution image generation, and they have been widely used in …