ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm系列模型)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

Continued pre-training #233

Closed yyggano closed 12 months ago

yyggano commented 1 year ago

If I want to continue pre-training on a new dataset the way RoBERTa-wwm was trained (whole word masking plus dynamic masking), is that infeasible, given that the original pre-training code has not been open-sourced?
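(Editor's note: the repository's original pre-training code is indeed unreleased, but continued MLM pre-training in the same spirit can be approximated with the Hugging Face `transformers` library. Below is a minimal sketch, not the authors' pipeline: `DataCollatorForWholeWordMask` applies masking at batch-creation time, which makes the masking dynamic across epochs. The corpus file name `my_corpus.txt` and all hyperparameters are placeholder assumptions. One caveat: because WordPiece does not split Chinese words into `##`-prefixed subwords, this collator alone degrades to character-level masking for Chinese; faithful whole-word masking additionally requires word-segmentation reference data, e.g. produced with LTP as in the paper.)

```python
# Hedged sketch: continued MLM pre-training from hfl/chinese-roberta-wwm-ext
# with whole word masking and dynamic masking via Hugging Face transformers.
from datasets import load_dataset
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForWholeWordMask,
    Trainer,
    TrainingArguments,
)

model_name = "hfl/chinese-roberta-wwm-ext"
tokenizer = BertTokenizerFast.from_pretrained(model_name)
model = BertForMaskedLM.from_pretrained(model_name)

# Placeholder corpus: one document or sentence per line.
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]

def tokenize(examples):
    return tokenizer(examples["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Masking happens per batch, so each epoch sees different masks (dynamic masking).
collator = DataCollatorForWholeWordMask(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="roberta-wwm-continued",  # placeholder output path
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```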

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 12 months ago

Closing this issue since no updates have been observed. Feel free to re-open if you need any further assistance.