ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm系列模型)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

About pipeline #91

Closed guofei1989 closed 4 years ago

guofei1989 commented 4 years ago

I'm a complete beginner with a question. The new version of Transformers provides a pipeline interface that can quickly apply a model to tasks such as "feature-extraction", "sentiment-analysis", "ner", "question-answering", and "fill-mask". I tried using Chinese-BERT-wwm directly in a pipeline and got an error. Is this functionality not supported?

ymcui commented 4 years ago

The model architecture is identical to Google's original BERT. I suggest trying the same code with the original Chinese BERT-base first; if that also fails, the problem is not with our model.
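Since the architecture is standard BERT, the model should work with the pipeline API like any other BERT checkpoint. A minimal sketch, assuming the `hfl/chinese-bert-wwm` checkpoint name on the Hugging Face Hub (substitute whatever local path or hub name you actually use):

```python
# Sketch: fill-mask pipeline with a whole-word-masking Chinese BERT.
# "hfl/chinese-bert-wwm" is an assumed hub identifier; replace it with
# your own checkpoint path if it differs.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/chinese-bert-wwm")

# The input must contain the tokenizer's mask token ([MASK] for BERT).
text = "今天天气很[MASK]。"
for prediction in fill_mask(text):
    # Each prediction is a dict with the filled token and its score.
    print(prediction["token_str"], prediction["score"])
```

If the pipeline call raises an error here but works with `bert-base-chinese`, that would point to a checkpoint or environment issue rather than the library.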

ymcui commented 4 years ago

Feel free to reopen this issue if you have any further questions.