Vincent131499 / Language_Understanding_based_BERT
A BERT-based pre-trained language model implementation, split into two steps: pre-training and fine-tuning. It currently includes three models, BERT, Roberta, and ALbert, all of which support Whole Word Mask mode (a rough sketch of whole-word masking appears below).
16 stars · 5 forks
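The repository description mentions Whole Word Mask (WWM) support, i.e. masking all WordPiece pieces of a word together rather than masking sub-tokens independently. The following is a minimal, hypothetical Python sketch of that idea, not the repository's actual code; the function name `whole_word_mask`, the masking rate, and the example tokens are assumptions for illustration, and handling of special tokens such as `[CLS]`/`[SEP]` is omitted.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]"):
    """Mask whole words: every WordPiece piece of a selected word is masked together."""
    # Group token indices into whole words ("##" marks a continuation piece).
    word_groups = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and word_groups:
            word_groups[-1].append(i)
        else:
            word_groups.append([i])

    random.shuffle(word_groups)
    num_to_mask = max(1, int(round(len(tokens) * mask_prob)))

    masked = list(tokens)
    labels = [None] * len(tokens)  # original token kept as the MLM label where masked
    masked_count = 0
    for group in word_groups:
        if masked_count >= num_to_mask:
            break
        for i in group:
            labels[i] = tokens[i]
            masked[i] = mask_token  # all pieces of the word are masked as a unit
        masked_count += len(group)
    return masked, labels

# Illustrative WordPiece-style input (hypothetical example)
tokens = ["pre", "##train", "##ing", "and", "fine", "##tun", "##ing", "models"]
masked, labels = whole_word_mask(tokens, mask_prob=0.3)
print(masked)
print(labels)
```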
issues
#1 · About mask lm (关于mask lm) · opened by andiShan11 3 years ago · 1 comment