ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm系列模型)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

After switching from BERT to this model, training no longer runs and reports a missing argument — how can I fix this? #242

Open ottt08 opened 10 months ago

ottt08 commented 10 months ago

The traceback (reconstructed in call order):

```
Traceback (most recent call last):
  File "C:\Users\10553\Desktop\chinese_ner-main\torch_ner\source\ner_main.py", line 258, in <module>
    NerMain().train()
  File "C:\Users\10553\Desktop\chinese_ner-main\torch_ner\source\ner_main.py", line 63, in train
    tokenizer = BertTokenizer.from_pretrained("chinese-roberta-wwm-ext-large", do_lower_case=self.config.do_lower_case)
TypeError: _from_pretrained() missing 1 required positional argument: 'init_configuration'
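For context, this class of error is raised by Python itself, not by the model files: some code path reaches an internal `_from_pretrained`-style method whose signature requires a positional `init_configuration` argument that the caller never supplies, which typically points to a mismatch between the project's code and the installed tokenizer library version. A minimal, self-contained sketch of the error shape (the class and method here are hypothetical stand-ins, not the real transformers API):

```python
# Hypothetical stand-in for a tokenizer class whose internal loader
# requires a positional 'init_configuration' argument.
class DemoTokenizer:
    @classmethod
    def _from_pretrained(cls, model_name, init_configuration):
        # Placeholder body; real tokenizer loading is not modeled here.
        return cls()

try:
    # Caller omits 'init_configuration', as in the traceback above.
    DemoTokenizer._from_pretrained("chinese-roberta-wwm-ext-large")
except TypeError as err:
    message = str(err)

# The message names the missing parameter, matching the reported error.
print(message)
```

Because the missing argument lives in an internal method the user never calls directly, the usual remedy is aligning the library version with what the project's code expects, rather than changing the `from_pretrained` call site.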

stale[bot] commented 9 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.