ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0
9.57k stars 1.38k forks

Cannot use model with transformers #118

Closed guoxiaojun1999 closed 4 years ago

guoxiaojun1999 commented 4 years ago

I tried to use the Chinese RoBERTa model following this URL: https://huggingface.co/hfl/chinese-roberta-wwm-ext

[screenshot: code used to load the model]

But it does not work and raises an error like this:

[screenshot: error message]
ymcui commented 4 years ago

Hi, please use `BertTokenizer` and `BertModel`, as indicated in https://github.com/ymcui/Chinese-BERT-wwm/blob/master/README_EN.md#huggingface-transformers. I have just tested under transformers==2.9.1, and everything works.

ymcui commented 4 years ago

Closing this issue since there has been no further discussion on this topic. Feel free to reopen if necessary.