haitongli/knowledge-distillation-pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
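Knowledge distillation trains a small "student" network to mimic a larger "teacher" by combining a soft-target term (KL divergence between temperature-softened output distributions, scaled by T²) with the usual hard-label cross-entropy. The sketch below is a minimal, dependency-free illustration of that standard KD loss (Hinton et al.'s formulation); the function names, default temperature, and weighting are illustrative assumptions, not this repository's exact API.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.9):
    """Illustrative knowledge-distillation loss (assumed hyperparameters).

    alpha * T^2 * KL(teacher_T || student_T)  -- soft-target term
    + (1 - alpha) * CE(student, label)        -- hard-label term
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence between the softened teacher and student distributions.
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student))
    # Standard cross-entropy of the student against the true label (T = 1).
    ce = -math.log(softmax(student_logits)[label])
    # T^2 rescaling keeps soft-target gradients comparable across temperatures.
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

In practice the same combination is computed on tensors with `torch.nn.functional.kl_div` and `cross_entropy`, but the arithmetic is identical to this scalar version.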
MIT License · 1.86k stars · 344 forks
Issues
#2 kd loss — opened by PaTricksStar · closed 6 years ago · 21 comments
#1 Can't get the pretrained model — opened by flygyyy · closed 6 years ago · 2 comments