yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
https://yoshitomo-matsubara.net/torchdistill/
MIT License

Knowledge distillation related #476

Closed: andynnnnn closed this issue 3 months ago

andynnnnn commented 3 months ago

Hello, it's great to see your achievements. I've been working with large CV models similar to YOLO-World and Grounding DINO, training them on my own dataset to obtain pretrained weight files. I'd like to know how I can derive a model with considerably fewer parameters while still retaining the original model's zero-shot detection capability.
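For reference, the general recipe behind this question is knowledge distillation: train a smaller student model to match the outputs of the larger, frozen teacher. The sketch below shows plain logit-based distillation in PyTorch as a minimal illustration only; `teacher`, `student`, and the inputs are hypothetical placeholders, and real open-vocabulary detectors such as YOLO-World or Grounding DINO would need detection-specific distillation targets (boxes, region features, text embeddings) rather than simple classification logits.

```python
# Minimal sketch of logit-based knowledge distillation (soft-target matching).
# The teacher/student modules and the (images, labels) batch are placeholders,
# NOT actual YOLO-World / Grounding DINO components.
import torch
import torch.nn.functional as F


def distillation_step(teacher, student, images, labels, optimizer,
                      temperature=4.0, alpha=0.5):
    """Run one training step: the student matches the teacher's softened logits
    while also fitting the ground-truth labels."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)   # frozen teacher forward pass

    student_logits = student(images)

    # KL divergence between softened distributions, scaled by T^2 as usual
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Standard supervised loss on the ground-truth labels
    ce_loss = F.cross_entropy(student_logits, labels)

    loss = alpha * kd_loss + (1.0 - alpha) * ce_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In torchdistill itself, this kind of recipe is expressed declaratively in a YAML configuration (teacher/student models, losses, forward hooks) rather than as a hand-written training loop like the sketch above.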

yoshitomo-matsubara commented 3 months ago

Please read https://github.com/yoshitomo-matsubara/torchdistill?tab=readme-ov-file#issues--questions--requests--pull-requests and use Discussions instead.

Closing this as it's not a bug.