xlxwalex / FCGEC

The Corpus & Code for EMNLP 2022 paper "FCGEC: Fine-Grained Corpus for Chinese Grammatical Error Correction" | FCGEC中文语法纠错语料及STG模型
https://aclanthology.org/2022.findings-emnlp.137
Apache License 2.0

Unexpected key(s) in state_dict: "XXX._bert.embeddings.position_ids" #28

Closed canghaiyunfan closed 10 months ago

canghaiyunfan commented 10 months ago

Running inference with the checkpoints.pt file provided by the authors raises an error. The error message is as follows:

Traceback (most recent call last):
  File "/data/FCGEC/model/STG-correction/joint_evaluate.py", line 148, in <module>
    evaluate(args)
  File "/data/FCGEC/model/STG-correction/joint_evaluate.py", line 48, in evaluate
    model.load_state_dict(params)
  File "/data/miniconda3/envs/bert/lib/python3.11/site-packages/torch/nn/modules/module.py", line 2152, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for JointModel:
	Unexpected key(s) in state_dict: "switch._encoder._bert.embeddings.position_ids", "tagger._encoder._bert.embeddings.position_ids", "generator._lmodel.bert.embeddings.position_ids".

xlxwalex commented 10 months ago

Hi,

Your error appears to be caused by a library version mismatch. I am not sure which version you are using, but both the version pinned in our requirements and the latest transformers BERT-embedding code do register the position_ids parameter (as shown below). Please try installing the transformers version specified in this repository.

self.register_buffer(
    "position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)), persistent=False
)
canghaiyunfan commented 10 months ago

Downgrading the transformers version solved it. Thanks for the reply.
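For readers who cannot downgrade transformers: the unexpected keys arise because newer transformers releases register position_ids as a non-persistent buffer (as in the snippet above), so it no longer appears in the model's state_dict, while the provided checkpoint still contains it. A minimal sketch of an alternative workaround is to drop those stale keys before loading; the helper name strip_position_ids is illustrative, and the key patterns are taken from the traceback in this issue:

```python
def strip_position_ids(state_dict):
    """Drop stale position_ids buffers that newer transformers
    versions no longer expect in a model's state_dict."""
    return {key: value for key, value in state_dict.items()
            if not key.endswith(".position_ids")}

# Usage sketch (not run here):
# params = torch.load("checkpoints.pt", map_location="cpu")
# model.load_state_dict(strip_position_ids(params))
```

Calling model.load_state_dict(params, strict=False) would also ignore the mismatched keys, but filtering them explicitly is safer, since strict=False silently tolerates any other missing or unexpected keys as well.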