zhouzaida / channel-distillation
PyTorch implementation for Channel Distillation
98 stars · 17 forks
Issues
#10 · At the start of training, are the ce and kd losses skipped so that only the cd loss is computed? · CatDroid · opened 1 month ago · 3 comments
#9 · About the initialization of the student model · swagshaw · opened 1 year ago · 2 comments
#8 · correct_t[correct_t == 0.0] = 1.0 · XiXiRuPan · closed 3 years ago · 1 comment
#7 · Model · zbh201 · closed 3 years ago · 4 comments
#6 · Channel-wise attention · xiaohui-hzlx · closed 4 years ago · 1 comment
#5 · Question about the function adjust_loss_alpha · UcanSee · closed 4 years ago · 1 comment
#4 · About channel_distillation.py · Youskrpig · closed 4 years ago · 2 comments
#3 · Training details · YuQi9797 · closed 4 years ago · 9 comments
#2 · A question about implementation details · YuQi9797 · closed 4 years ago · 0 comments
#1 · Trained model · wzjiang · closed 4 years ago · 1 comment