ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm series of models)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

Chinese-BERT-wwm fine-tuning #18

Closed nietao2 closed 5 years ago

nietao2 commented 5 years ago

When fine-tuning with Chinese-BERT-wwm, do I need to use LTP for word segmentation?

ymcui commented 5 years ago

No. It works the same as ordinary BERT and uses a character-level tokenizer. Whole Word Masking only affects the masking strategy during pre-training; it does not change how inputs are prepared for downstream tasks.
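A minimal sketch of what this means in practice, assuming the Hugging Face `transformers` library and the released checkpoint name `hfl/chinese-bert-wwm-ext` (the exact checkpoint name is an assumption here): raw, unsegmented text is passed straight to the character-level tokenizer, with no LTP step.

```python
from transformers import BertTokenizer

# Load the tokenizer for the wwm checkpoint (checkpoint name assumed).
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")

# Raw, unsegmented text goes directly into the tokenizer.
text = "使用语言模型来预测下一个词的概率。"

# Each Chinese character becomes its own token, e.g. ['使', '用', '语', ...]
tokens = tokenizer.tokenize(text)
print(tokens)

# The same encoding is fed to the model for any downstream fine-tuning task.
encoding = tokenizer(text, return_tensors="pt")
print(encoding["input_ids"].shape)
```

Because the vocabulary and tokenization are identical to vanilla Chinese BERT, the wwm checkpoints are drop-in replacements for fine-tuning.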

nietao2 commented 5 years ago

Thanks!

guotong1988 commented 4 years ago

Why is word segmentation not needed when fine-tuning?