Closed — dr-GitHub-account closed this issue 2 years ago
These models share the same architecture as BERT, so you can use them directly as initial weights and simply continue training on the MLM objective. The transformers library provides methods for training BERT; please look it up yourself.
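The "continue training on MLM" step above needs no labels: the model learns from masked-out tokens in the raw corpus. As a minimal sketch in plain Python (no transformers dependency; the token list, stand-in vocabulary, and the 15% selection with the 80/10/10 replacement rule follow the standard BERT recipe), whole-word masking can be illustrated like this:

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]",
                    vocab=None, rng=None):
    """Sketch of BERT's whole-word masking (wwm) objective.

    Groups WordPiece continuations (tokens starting with '##') with the
    preceding token so a word is always masked as a unit, then applies
    the standard 80/10/10 rule: 80% [MASK], 10% random token, 10% kept.
    Returns (masked_tokens, labels) where labels is None at unmasked
    positions, mirroring how the MLM loss ignores them.
    """
    rng = rng or random.Random(0)
    vocab = vocab or ["the", "cat", "sat"]  # stand-in vocabulary

    # 1. Group subword indices into whole words.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])

    # 2. Pick ~15% of the *words* (not subwords) to mask.
    n_to_mask = max(1, round(len(words) * mask_prob))
    picked = rng.sample(words, n_to_mask)

    masked = list(tokens)
    labels = [None] * len(tokens)
    for word in picked:
        for i in word:
            labels[i] = tokens[i]
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token          # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)   # 10%: random token
            # else: 10% keep the original token
    return masked, labels
```

In actual training this bookkeeping is handled for you, e.g. by `DataCollatorForLanguageModeling` (subword masking) or `DataCollatorForWholeWordMask` in the transformers library; the sketch only shows why continued MLM pretraining works on unlabeled downstream text.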
Got it, thanks! I'll look into it.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Closing the issue, since no further updates were observed. Feel free to re-open if you need any further assistance.
On the downstream NER task, directly fine-tuning the PyTorch Chinese whole-word-masking pretrained models downloaded from huggingface (chinese-bert-wwm-ext, chinese-roberta-wwm-ext, chinese-roberta-wwm-ext-large, etc.) already gives decent results. Now I would like to first run unsupervised adaptive fine-tuning on the downstream task's corpus (i.e., another round of MLM pretraining on the downstream data), and then fine-tune on the downstream task, to see whether that yields better results. Is there code that implements this?
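For reference, the adaptive fine-tuning step described above can be sketched with the `run_mlm.py` example script that ships with the transformers repository (under examples/pytorch/language-modeling). The command below is a sketch only: the training file, hyperparameters, and output directory are placeholders you would adapt to your corpus.

```shell
# Sketch: continued MLM pretraining ("adaptive fine-tuning") on a
# downstream corpus, starting from a Chinese wwm checkpoint.
# downstream_corpus.txt is a placeholder: one sentence per line.
python run_mlm.py \
  --model_name_or_path hfl/chinese-roberta-wwm-ext \
  --train_file downstream_corpus.txt \
  --line_by_line \
  --max_seq_length 512 \
  --per_device_train_batch_size 16 \
  --num_train_epochs 3 \
  --do_train \
  --output_dir ./adapted-roberta-wwm-ext \
  --overwrite_output_dir
```

Note that `run_mlm.py` applies standard subword-level masking; for whole-word masking on Chinese text, the transformers repo also has a research-project script (`examples/research_projects/mlm_wwm/run_mlm_wwm.py`) that additionally requires a word-segmentation reference file. The resulting checkpoint in the output directory can then be fine-tuned on NER as usual.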