Lisennlp / TinyBert
A simple, easy-to-use TinyBert: a pretrained language model distilled from BERT via knowledge distillation
251 stars · 49 forks
Issues
#14 · Question about dimension distillation · Wanan-ni · opened 9 months ago · 0 comments
#13 · Data format · zhanghanweii · opened 1 year ago · 1 comment
#12 · Hello, what should student_model in general.sh be initialized to? I don't quite understand this · qingnuan · opened 2 years ago · 11 comments
#11 · Why does general distillation also use task data? Shouldn't it use general data? · Vincent-Ww · opened 2 years ago · 1 comment
#10 · The trained model download URL is 404 · matrix-yang · opened 2 years ago · 0 comments
#9 · Data augmentation for Chinese tasks · pong991 · opened 2 years ago · 1 comment
#8 · Chinese data augmentation · ysujiang · closed 2 years ago · 2 comments
#7 · Question about the training steps · ysujiang · closed 2 years ago · 1 comment
#6 · Does inference speed improve after distillation? · qdchenxiaoyan · closed 2 years ago · 1 comment
#5 · How is the MSE computed when the teacher and student hidden sizes differ? · lbe0613 · closed 2 years ago · 2 comments
#4 · Help wanted: a general-domain Chinese TinyBERT model · lengfeng343 · closed 3 years ago · 1 comment
#3 · Can I use it to distill RoBERTa? · Tweakzx · closed 3 years ago · 1 comment
#2 · Is there any format conversion of the corpus (like step 1 of general distillation in the original TinyBERT repo)? · janezzzz · closed 3 years ago · 1 comment
#1 · Can you provide the script to evaluate the model? Thank you! · yzq170320 · closed 3 years ago · 2 comments
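Two of the issues above (#14 and #5) ask how the hidden-state MSE is computed when the teacher and student hidden sizes differ. In the original TinyBERT formulation, the student hidden states are first mapped into the teacher's dimension with a learned linear projection, and the MSE is taken on the projected states. Whether this repository does exactly the same is not confirmed here; the PyTorch sketch below only illustrates that standard recipe, and all names and sizes in it are assumptions, not taken from this codebase.

```python
import torch
import torch.nn as nn

class HiddenStateDistillLoss(nn.Module):
    """MSE between projected student hidden states and teacher hidden states.

    A learnable linear map lifts the student hidden size (e.g. 312) to the
    teacher hidden size (e.g. 768) so the two tensors can be compared
    elementwise, as in the original TinyBERT paper. Names are illustrative.
    """

    def __init__(self, student_hidden=312, teacher_hidden=768):
        super().__init__()
        self.proj = nn.Linear(student_hidden, teacher_hidden, bias=False)
        self.mse = nn.MSELoss()

    def forward(self, student_hidden_states, teacher_hidden_states):
        # student_hidden_states: (batch, seq_len, student_hidden)
        # teacher_hidden_states: (batch, seq_len, teacher_hidden)
        return self.mse(self.proj(student_hidden_states), teacher_hidden_states)

# Usage sketch with random tensors standing in for real hidden states.
loss_fn = HiddenStateDistillLoss()
student_h = torch.randn(2, 128, 312)
teacher_h = torch.randn(2, 128, 768)
loss = loss_fn(student_h, teacher_h)
```

The projection is trained jointly with the student, so the student is free to keep a smaller hidden size while still matching the teacher's layer representations during distillation.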