murray-z / knowledge_distillation
knowledge distillation: applies knowledge distillation, training BERT first and then using it to guide TextCNN (see the sketch below)
16 stars · 7 forks
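The description above points to the standard teacher-student setup: fine-tune BERT as the teacher, then distill its predictions into a TextCNN student. The following is a minimal sketch of such a distillation objective in PyTorch; the function name, temperature T, and weight alpha are illustrative assumptions, not code taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft-target KL term (teacher guidance) with hard-label cross-entropy.

    Note: T and alpha are assumed hyperparameters for illustration only.
    """
    # Soft targets: match the student's tempered distribution to the teacher's.
    # Scaling by T*T keeps the gradient magnitude comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

A larger alpha leans more on the teacher's soft targets, a smaller one on the gold labels; which balance works better is exactly the kind of question raised in issue #2 below.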
Issues
Does distillation perform worse than training TextCNN directly on the data?
#2 · opened by czhxiaohuihui · 3 years ago · 2 comments
Hello, could you share the experiment data? I think your code is very well written and would like to study it carefully.
#1 · opened by Batman001 · closed 3 years ago · 1 comment