Hi,
Your error looks like a library version issue. I am not sure which version you are using, but both the transformers version pinned in our requirements and the latest transformers BERT embedding code contain the position_ids buffer (shown below), so you could try installing the transformers version specified in this repository.
self.register_buffer(
    "position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)), persistent=False
)
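For reference, a quick way to check which behavior your installed transformers has is to inspect a fresh BertModel's state_dict. This is a minimal sketch using only the standard transformers API; the version cutoff of roughly v4.31, where position_ids became a non-persistent buffer, is my reading of the upstream change, not something stated in this thread.

import transformers
from transformers import BertConfig, BertModel

print(transformers.__version__)

# Build a randomly initialized BERT just to look at its state_dict keys.
model = BertModel(BertConfig())

# Older releases serialize the position_ids buffer (prints True); releases
# around v4.31 and later register it with persistent=False (prints False).
print(any(k.endswith("embeddings.position_ids") for k in model.state_dict()))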
Downgrading the transformers version solved it. Thanks for the reply!
Running inference with the checkpoints.pt file provided by the authors raises an error. The traceback is as follows:
Traceback (most recent call last):
  File "/data/FCGEC/model/STG-correction/joint_evaluate.py", line 148, in <module>
    evaluate(args)
  File "/data/FCGEC/model/STG-correction/joint_evaluate.py", line 48, in evaluate
    model.load_state_dict(params)
  File "/data/miniconda3/envs/bert/lib/python3.11/site-packages/torch/nn/modules/module.py", line 2152, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for JointModel:
	Unexpected key(s) in state_dict: "switch._encoder._bert.embeddings.position_ids", "tagger._encoder._bert.embeddings.position_ids", "generator._lmodel.bert.embeddings.position_ids".
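If pinning the library version is not an option, the unexpected keys named in this traceback can also be filtered out of the checkpoint before loading, since position_ids is a deterministic buffer the model rebuilds from config.max_position_embeddings. A minimal sketch, not taken from the repo's code, assuming model is the already-instantiated JointModel from joint_evaluate.py:

import torch

# Load the checkpoint and drop the stale position_ids buffers; the model
# recreates them itself, so nothing is lost by removing them here.
params = torch.load("checkpoints.pt", map_location="cpu")
params = {k: v for k, v in params.items() if not k.endswith(".position_ids")}
model.load_state_dict(params)  # `model` is the constructed JointModel (assumed)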