ymcui / MacBERT

Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
https://www.aclweb.org/anthology/2020.findings-emnlp.58/
Apache License 2.0

Where is the code for feeding input text through transformers? #21

Closed Mr-IT007 closed 1 year ago

Mr-IT007 commented 1 year ago

Where is the code for feeding input text data through transformers?
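For reference, a minimal sketch of feeding text to MacBERT through HuggingFace transformers, assuming the published `hfl/chinese-macbert-base` checkpoint (the example sentence is made up for illustration). MacBERT reuses the standard BERT classes, so `BertTokenizer`/`BertModel` are used rather than any MacBERT-specific class:

```python
# Minimal sketch (not from this thread): load MacBERT via transformers and
# encode one sentence. Assumes the hfl/chinese-macbert-base checkpoint.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-macbert-base")
model = BertModel.from_pretrained("hfl/chinese-macbert-base")

# Tokenize the input text and run it through the model.
inputs = tokenizer("使用MacBERT编码一句话", return_tensors="pt")
outputs = model(**inputs)

# Contextual token representations: (batch_size, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)
```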

stale[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 1 year ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.