sseung0703 / KD_methods_with_TF
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added).
MIT License · 266 stars · 61 forks
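Several of the issues below touch on the classic soft-logits distillation loss (e.g. #18 on fluctuating Soft Logits results). For orientation only, here is a minimal illustrative sketch of the standard Hinton-style soft-logits KD loss in TensorFlow; it is not code from this repository, and the function name, temperature default, and epsilon are assumptions.

import tensorflow as tf

# Illustrative sketch of the standard soft-logits KD loss (Hinton et al., 2015).
# Not taken from this repository; names and defaults are assumptions.
def soft_logits_kd_loss(student_logits, teacher_logits, temperature=4.0):
    # Soften both distributions with the temperature.
    teacher_prob = tf.nn.softmax(teacher_logits / temperature)
    student_log_prob = tf.nn.log_softmax(student_logits / temperature)
    # KL(teacher || student) per example, averaged over the batch;
    # the T^2 factor keeps gradient magnitudes comparable across temperatures.
    kl = tf.reduce_sum(
        teacher_prob * (tf.math.log(teacher_prob + 1e-8) - student_log_prob),
        axis=-1)
    return tf.reduce_mean(kl) * temperature ** 2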
Issues
#18 · The results of Soft Logits fluctuate quite a lot · opened by zhongshaoyy 5 years ago · 3 comments
#17 · Probably mistaken implementation of the RKD method · opened by wisdom0530 5 years ago · 3 comments
#16 · Why only update on train_op2 but not train_op? · closed by snownus 5 years ago · 4 comments
#15 · Choice of SVD gradient · closed by jonkoi 5 years ago · 1 comment
#14 · Which is the paper for KD_EID? · closed by guanfuchen 5 years ago · 1 comment
#13 · Create LICENSE · closed by sseung0703 5 years ago · 0 comments
#12 · Can you provide the plot script? · closed by guanfuchen 5 years ago · 1 comment
#11 · How many epochs did you set to train the teacher model? · opened by Xiaocong6 5 years ago · 2 comments
#10 · I found a major issue in the network and distillation module! · closed by sseung0703 5 years ago · 1 comment
#9 · Multi-label classification · closed by adavoudi 5 years ago · 1 comment
#8 · Hi, I think there are two more KD methods · closed by yyht 5 years ago · 2 comments
#7 · What about TF 2.0? · closed by Oktai15 5 years ago · 1 comment
#6 · What dataset is used for the experiment? · closed by Xiaocong6 5 years ago · 2 comments
#5 · Implementation of ResNet · closed by cupwater 5 years ago · 11 comments
#4 · TypeError: ResNet() missing 1 required positional argument: 'label' · closed by CBHealth 5 years ago · 9 comments
#3 · Update train_w_distill.py · closed by CBHealth 5 years ago · 1 comment
#2 · AB algorithm fixed · closed by bhheo 5 years ago · 0 comments
#1 · Can you consider adding the method of attention transfer and neuron-selectivity-transfer? · closed by Xiaocong6 5 years ago · 4 comments