imirzadeh/Teacher-Assistant-Knowledge-Distillation
Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf
MIT License · 256 stars · 47 forks
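The repository accompanies the TAKD paper linked above, which distills knowledge in two hops (teacher → teacher assistant → student) instead of one. For orientation when reading the issues below, here is a minimal sketch of the standard temperature-scaled distillation objective this line of work builds on, written in PyTorch; the function name and the hyperparameter values (T, alpha) are illustrative assumptions, not taken from this repository's code.

```python
# Minimal sketch of the knowledge-distillation loss (Hinton et al., 2015),
# which TAKD applies twice: teacher -> TA, then TA -> student.
# T and alpha values below are illustrative, not the repo's settings.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both distributions with temperature T.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    # KL term is scaled by T^2 so its gradient magnitude matches
    # the hard-label term.
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * ce
```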
Issues
#23 · the performance of plain10 and plain2 on cifar100 · cotyyang · opened 2 years ago · 0 comments
#22 · I want to know nni version · TaiseiYamana · closed 3 years ago · 0 comments
#21 · nni issue · arthur0219 · opened 3 years ago · 0 comments
#20 · Implementation of DML as a benchmark · aryanasadianuoit · opened 3 years ago · 0 comments
#19 · Three main issues: seed optimization, parameter tuning on the test set, and wrong normalization for cifar10 · tknovisky · opened 3 years ago · 1 comment
#18 · ImageNet - ResNet experiment · wonchulSon · opened 4 years ago · 0 comments
#17 · TA training parameters for CIFAR-100 experiment · ghost · opened 4 years ago · 2 comments
#16 · maybe one mistake · jiangmijiangmi · opened 4 years ago · 2 comments
#15 · Baseline training · ming666-wum · opened 4 years ago · 3 comments
#14 · Tensorboard web UI · mlleo · closed 4 years ago · 3 comments
#13 · Experimental results · MaorunZhang · closed 4 years ago · 5 comments
#12 · excuse me ,how can i solve this problem · storm-zhuo · closed 4 years ago · 0 comments
#11 · Baseline training · peterbj95 · opened 4 years ago · 3 comments
#10 · Cifar 10 training - resenet 110 to resnet 8 · karanchahal · closed 4 years ago · 6 comments
#9 · Add Licence · imirzadeh · closed 5 years ago · 0 comments
#8 · Teacher (resnet26) best accuracy · TeerathChandani · closed 5 years ago · 14 comments
#7 · Optimized values · TeerathChandani · closed 5 years ago · 12 comments
#6 · command not found: nnictl · TeerathChandani · closed 5 years ago · 2 comments
#5 · Error while running the code · TeerathChandani · closed 5 years ago · 1 comment
#4 · Choice of Loss Function · adrianloy · closed 5 years ago · 2 comments
#3 · Student's performance on resnet8 · aminshabani · closed 5 years ago · 1 comment
#2 · Training Issue (problem with nni) · userb2020 · closed 5 years ago · 3 comments
#1 · 'resnet 110' as teacher, 'resnet20' as TA, 'resnet8' as student on CIFAR100 · InstantWindy · closed 5 years ago · 5 comments