yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
https://yoshitomo-matsubara.net/torchdistill/
MIT License
1.37k stars 132 forks

How should I use Torchdistill? #365

Closed 2842193395 closed 1 year ago

2842193395 commented 1 year ago

How should torchdistill be used in a project?

yoshitomo-matsubara commented 1 year ago

This is neither a bug report nor an issue.

Please use Discussions (Q&A) above (instead of Issues) for questions. As explained here, I want to keep Issues mainly for bug reports.