ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm系列模型)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0
9.57k stars 1.38k forks

Pretraining Wikipedia: Traditional/Simplified Chinese #132

Closed d223302 closed 4 years ago

d223302 commented 4 years ago

Hello: Thank you for providing the pretrained models. I'd like to ask: when pretraining BERT-wwm, was the Chinese Wikipedia corpus you used Simplified Chinese, Traditional Chinese, or both?

ymcui commented 4 years ago

Both Simplified and Traditional Chinese are included; we did not convert the Traditional portion into Simplified.
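As an aside, this distinction is visible at the text level: Simplified and Traditional forms of the same word are usually distinct Unicode codepoints, so an unconverted corpus contains both surface forms as different tokens. A minimal sketch (the example word is our own, not taken from the training data):

```python
# Simplified vs. Traditional renderings of the same word, "machine learning".
simplified = "机器学习"
traditional = "機器學習"

# Most characters differ at the codepoint level, though some (like 器)
# are written identically in both scripts. An unconverted mixed corpus
# therefore sees the differing forms as entirely separate characters.
diff = [(s, t) for s, t in zip(simplified, traditional) if s != t]
print(diff)  # [('机', '機'), ('学', '學'), ('习', '習')]
```

Tools such as OpenCC are commonly used when one does want to normalize a corpus to a single script; per the reply above, that step was deliberately skipped here.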