jl749 / knowledge-distillation-pytorch

A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
MIT License

change files under mnist/ to modular format + distillation code reading #3

Closed · jl749 closed this 2 years ago

jl749 commented 2 years ago

what

The mnist/ directory covers the basics: code-reading notes plus reformatting of the original scripts into a modular layout (a hypothetical sketch of such a layout follows).
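
As a rough illustration only, a modular split might look like the following. These file names are hypothetical and are not taken from the actual contents of mnist/:

```
mnist/
├── models.py      # teacher / student network definitions
├── datasets.py    # MNIST loading and transforms
├── distill.py     # distillation loss and training loop
└── main.py        # entry point wiring the pieces together
```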

why

The original scripts contain a lot of redundant code; refactoring them makes the code easier to read and debug, and clarifies the distillation process itself (see the loss sketch below).
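
For reference, a minimal sketch of the standard Hinton-style distillation loss that this kind of refactor is meant to expose. The function name and the defaults for the temperature `T` and the soft/hard weighting `alpha` are illustrative assumptions, not values taken from this repo:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD loss: KL between softened distributions + CE on hard labels.

    T and alpha are illustrative defaults, not values from this repo.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # Scaling by T**2 keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```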

TODO