yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
https://yoshitomo-matsubara.net/torchdistill/
MIT License
1.37k stars 132 forks

more concise #466

Closed MostHumble closed 4 months ago

yoshitomo-matsubara commented 4 months ago

Hi @MostHumble

Thank you for the PR. I think this is just a very minor style change rather than an important one, so I am closing this PR. Next time, please start with an issue.