karanchahal / distiller
A large scale study of Knowledge Distillation.
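The repository studies knowledge distillation, where a small student network is trained to match the softened output distribution of a larger teacher. As a minimal sketch of the core idea (not this repo's implementation), the distillation loss of Hinton et al. (2015) is the KL divergence between temperature-softened teacher and student softmax outputs, scaled by T²:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients keep roughly the same magnitude as T varies.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * T * T
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the function names and temperature value here are illustrative, not taken from the repo.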
217 stars · 30 forks
issues
#5 · Using Distillation on a different dataset using a trained teacher — opened by Gharibw 4 years ago · 2 comments
#4 · Is it possible to train a smaller model (student)? — JuanDavidG1997, closed 4 years ago · 1 comment
#3 · Is it possible to keep the learning rate constant? — JuanDavidG1997, closed 4 years ago · 3 comments
#2 · Trouble training teachers — JuanDavidG1997, closed 4 years ago · 4 comments
#1 · Pytorch lightning? — opened by reactivetype 4 years ago · 2 comments