wzzzd / pretrain_bert_with_maskLM
Pretrain a BERT model with the masked language modeling (MLM) task, learning representations from a domain-specific corpus to improve downstream-task performance (see the sketch below).
41 stars · 11 forks
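As a rough illustration of the approach the description names, here is a minimal sketch of MLM pretraining on a domain corpus using the Hugging Face transformers library. This is not the pipeline defined in this repository; the model name, file path, and hyperparameters are assumptions for illustration only.

```python
# Minimal MLM pretraining sketch with Hugging Face transformers.
# Not this repository's training code; paths, model name, and
# hyperparameters below are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = BertForMaskedLM.from_pretrained("bert-base-chinese")

# Domain-specific corpus: one sentence or document per line (assumed path).
dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# The collator randomly masks tokens (15% here) and builds the MLM labels.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

args = TrainingArguments(
    output_dir="mlm-checkpoints",
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=tokenized,
).train()
```

The resulting checkpoint can then be loaded as the encoder for downstream fine-tuning, which is the domain-adaptation use case the description points at.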
Issues
#10 · Why is the saved trained model under 90 MB when the original BERT is over 300 MB, and should downstream tasks load this model directly? · Mirage7320 · closed 5 months ago · 2 comments
#9 · Test accuracy · scofield687 · opened 1 year ago · 0 comments
#8 · Question about multi-GPU training · xiaozhu1106 · opened 1 year ago · 3 comments
#7 · Question about word embeddings · hongxiaDu · opened 2 years ago · 4 comments
#6 · Question about pretraining on a custom dataset · minlik · closed 2 years ago · 4 comments
#5 · Question about using my own dataset · zyxhangzhou · closed 2 years ago · 0 comments
#4 · Question about tokenization for whole word masking (wwm) · jarork · opened 2 years ago · 4 comments
#3 · Reducing the time needed to build the pretraining data · jarork · opened 2 years ago · 1 comment
#2 · Problem with the gold standard · jarork · closed 2 years ago · 3 comments
#1 · Can I transfer to TF? · quan3 · opened 3 years ago · 0 comments