ymcui / MacBERT

Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
https://www.aclweb.org/anthology/2020.findings-emnlp.58/
Apache License 2.0

Are MacBERT's parameters compatible with RoBERTa-wwm's parameters? #4

Closed daizh closed 3 years ago

daizh commented 3 years ago

Can model code originally set up for RoBERTa-wwm be fine-tuned directly by simply swapping in a MacBERT checkpoint?

ymcui commented 3 years ago

Yes. For fine-tuning you can swap the checkpoint directly; the main architecture is the same.
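
For illustration, here is a minimal sketch of the swap, assuming the Hugging Face Transformers library and the checkpoint name `hfl/chinese-macbert-base` (the model ID is an assumption, not taken from this thread). Since MacBERT keeps the standard BERT architecture, it loads with the same `BertTokenizer` and `BertForSequenceClassification` classes typically used for RoBERTa-wwm-ext, so existing fine-tuning code only needs to point at the new checkpoint.

```python
# Sketch: loading a MacBERT checkpoint with standard BERT classes.
# Assumes the Hugging Face Transformers library and the checkpoint
# name "hfl/chinese-macbert-base" (an assumption for illustration).
from transformers import BertTokenizer, BertForSequenceClassification

checkpoint = "hfl/chinese-macbert-base"  # swap in place of a RoBERTa-wwm path
tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# A quick forward pass to confirm the checkpoint loads and runs.
inputs = tokenizer("使用语言模型来预测下一个词的概率。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # torch.Size([1, 2])
```

The rest of an existing fine-tuning loop (optimizer, scheduler, training data) can stay unchanged; only the checkpoint path differs.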

daizh commented 3 years ago

Got it, thank you.