# HobbitLong / RepDistiller

[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods

BSD 2-Clause "Simplified" License · 2.11k stars · 389 forks
## Issues (closed)
| # | Title | Opened by | Status | Comments |
|-----|-------|-----------|--------|----------|
| #10 | Results on ImageNet | xuguodong03 | closed 4 years ago | 1 |
| #9 | In the 2 result tables, WRN-40-2, as the teacher, after distilling the students, the students get higher performance (CRD+KD), why? | splinter21 | closed 4 years ago | 1 |
| #8 | Regression task | xjcvip007 | closed 4 years ago | 1 |
| #7 | Teacher/Student Parameter ratio | iiSeymour | closed 4 years ago | 2 |
| #6 | A question for Experimental result | baek85 | closed 4 years ago | 1 |
| #5 | The setting of Z_v1 and Z_v2 in class ContrastMemory? | HaoKun-Li | closed 4 years ago | 3 |
| #4 | AttributeError: 'CIFAR100Instance' object has no attribute 'train_data' | LiqunChen0606 | closed 4 years ago | 2 |
| #3 | Cannot achieve the reported accuracy in paper | xuguodong03 | closed 4 years ago | 2 |
| #2 | Selection of teacher | yaxingwang | closed 4 years ago | 2 |
| #1 | Fix readme typo | erjanmx | closed 4 years ago | 1 |
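Issue #4's `AttributeError: 'CIFAR100Instance' object has no attribute 'train_data'` is the classic symptom of a torchvision API change: newer torchvision releases replaced the CIFAR datasets' `train_data`/`test_data` attributes with a single `data` attribute. A minimal, version-agnostic accessor sketch (the function name `get_images` and the duck-typed `dataset` argument are illustrative, not part of the repository):

```python
def get_images(dataset):
    """Return the raw image array from a CIFAR-style dataset object.

    Works across torchvision versions: newer releases expose `data`,
    older ones exposed `train_data` (train split) or `test_data`.
    """
    # Prefer the unified attribute used by current torchvision.
    if hasattr(dataset, "data"):
        return dataset.data
    # Fall back to the legacy split-specific attributes.
    if hasattr(dataset, "train_data"):
        return dataset.train_data
    return dataset.test_data
```

Code that reads `dataset.train_data` directly can be patched to call such a helper (or simply to use `dataset.data`) without changing any other logic.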