ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

I'm guessing Whole Word Masking (wwm) is really just a more efficient approach; couldn't character-level masking with more pre-training time and more randomness reach the same final result? #187

Closed · guotong1988 closed this issue 3 years ago

guotong1988 commented 3 years ago

@ymcui Thanks a lot!

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

guotong1988 commented 3 years ago

@ymcui THX!

ymcui commented 3 years ago

What you describe, reaching the same effect at the character level by adding pre-training time and randomness, is hard to achieve in practice. Compute is already the main bottleneck, so it is not advisable to trade effectiveness for pure randomness. WWM always selects whole words to process; if you relied entirely on random character masking, the probability that all masked positions in a sentence happen to line up with whole words would be very low.
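As a rough illustration of the point above, here is a minimal sketch (not code from this repo; the toy sentence, segmentation, and helper names are hypothetical) contrasting whole-word masking, which masks every character of each selected word, with independent character-level masking, which only rarely covers complete words by chance:

```python
import random

MASK = "[MASK]"

def wwm_mask(tokens, word_spans, mask_ratio=0.15):
    """Whole Word Masking: randomly pick whole words (character spans) and
    mask every character inside each chosen word, so no word is partially masked."""
    target = max(1, round(len(tokens) * mask_ratio))
    spans = list(word_spans)
    random.shuffle(spans)
    masked, covered = list(tokens), 0
    for start, end in spans:
        if covered >= target:
            break
        for i in range(start, end):
            masked[i] = MASK
        covered += end - start
    return masked

def char_mask(tokens, mask_ratio=0.15):
    """Character-level masking: each position is masked independently,
    so masked positions usually cut across word boundaries."""
    return [MASK if random.random() < mask_ratio else t for t in tokens]

# Hypothetical example: "使用 语言 模型 预测" segmented into four two-character words.
tokens = list("使用语言模型预测")
word_spans = [(0, 2), (2, 4), (4, 6), (6, 8)]
print(wwm_mask(tokens, word_spans))  # e.g. ['使','用','[MASK]','[MASK]','模','型','预','测']
print(char_mask(tokens))             # e.g. ['使','[MASK]','语','言','模','型','预','测']
```

With independent character masking, the chance that every masked character happens to fall inside fully masked words shrinks quickly as the sentence gets longer, which is why relying on randomness alone to reproduce the WWM pattern is so inefficient.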