yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
https://yoshitomo-matsubara.net/torchdistill/
MIT License

[BUG] Missing Link in Readme #389

Closed · m-parchami closed this issue 1 year ago

m-parchami commented 1 year ago

Hi, thanks for the great repo! It saves us a lot of time!

I wanted to download the checkpoint for the ILSVRC2012 R34 -> R18 distillation, but the checkpoint in the last column (KR: Knowledge Review) seems to be missing. The top-1 accuracy is mentioned in the main README, but it disappears in the second README (in the ImageNet folder).

Thanks a lot. Amin

yoshitomo-matsubara commented 1 year ago

Hello @m-parchami,

I implemented KR after the torchdistill paper came out, which is why the official directory does not include the config and checkpoint for KR. You can find the checkpoint in the release notes: https://github.com/yoshitomo-matsubara/torchdistill/releases/tag/v0.2.5
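
For reference, here is a minimal sketch of loading the released checkpoint into a torchvision ResNet-18. The asset file name and the checkpoint's dictionary layout below are assumptions, so inspect the actual release asset (and `ckpt.keys()`) if they differ:

```python
import torch
from torchvision.models import resnet18

# Hypothetical asset file name under the v0.2.5 release tag; replace it
# with the actual file name listed on the release page.
CKPT_URL = (
    'https://github.com/yoshitomo-matsubara/torchdistill/releases/download/'
    'v0.2.5/ilsvrc2012-resnet18_from_resnet34-kr.pt'
)

ckpt = torch.hub.load_state_dict_from_url(CKPT_URL, map_location='cpu')
# Checkpoints are often a dict wrapping the weights under a 'model' key
# (an assumption here); fall back to treating the file as a raw state_dict.
state_dict = ckpt.get('model', ckpt)

model = resnet18(num_classes=1000)
model.load_state_dict(state_dict)
model.eval()
```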

The configuration file is here: https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/ilsvrc2012/kr/resnet18_from_resnet34.yaml
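
If you want to peek at what the KR config specifies before training, here is a minimal sketch using plain PyYAML (assuming the sample config uses standard YAML without custom tags, and that you have cloned the repo locally):

```python
import yaml

# Path inside a local clone of the torchdistill repo.
config_path = 'configs/sample/ilsvrc2012/kr/resnet18_from_resnet34.yaml'
with open(config_path) as f:
    config = yaml.safe_load(f)

# Top-level sections of the config, e.g. datasets/models/train/test.
print(config.keys())
# Peek at the training section to see how the KR distillation is set up.
print(config.get('train', {}).keys())
```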