ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series models)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

The exact English and Chinese pretraining data that are exactly the same as the BERT paper's pretraining data #176

Closed — guotong1988 closed this issue 3 years ago

guotong1988 commented 3 years ago

Does anyone know where to get them? Thank you.
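(Editor's note: the exact corpora were never published as downloadable datasets. The BERT paper used BooksCorpus plus an English Wikipedia dump, and this repo's Chinese models were pretrained on Chinese Wikipedia. A minimal sketch of a commonly used approximation, assuming the Hugging Face `datasets` library; the dataset names and snapshot configs below are illustrative substitutes, not the original data:)

```python
# A minimal sketch, NOT the original corpora: the exact BERT pretraining
# data (the original BooksCorpus + the Wikipedia dump used in 2018) was
# never released, so these public Hugging Face datasets are approximations.
from datasets import load_dataset

# English Wikipedia snapshot (assumption: the "20220301.en" config of the
# `wikipedia` dataset; any recent dump differs from BERT's original dump).
en_wiki = load_dataset("wikipedia", "20220301.en", split="train")

# BookCorpus re-crawl; the original BooksCorpus is no longer distributed.
books = load_dataset("bookcorpus", split="train")

# Chinese Wikipedia, the corpus family used for the Chinese BERT-wwm models.
zh_wiki = load_dataset("wikipedia", "20220301.zh", split="train")

# Sanity check: inspect the first document of each corpus.
print(en_wiki[0]["text"][:200])
print(zh_wiki[0]["text"][:200])
```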

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

guotong1988 commented 3 years ago

@ymcui Thank you very much!
