v-mipeng / LexiconAugmentedNER

Reject complicated operations for incorporating lexicon for Chinese NER.

Question about making predictions with a trained model #56

Open zhengguanyu opened 2 years ago

zhengguanyu commented 2 years ago

Hello, I'd like to ask you a question.

zhengguanyu commented 2 years ago

After setting status to test, is there a required format for the test_file? I removed everything involving gold_result from the evaluate function, and that works on a test set that has gold labels, in the following format:

测 O
测 B-XXX
测 E-XXX

In other words, with a small tweak to the existing evaluate function, prediction also works. But on a test set I prepared myself the output is empty. With test_file in the following format:

测
测
测

the returned pred_results is empty. I also tried imitating the training-set format:

测 O
测 O
测 O

but the output pred_results is still empty.

Is this functionality simply not provided in the code? If it is, please point me to it. Thanks.
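A minimal sketch (not part of the repository) of one way to prepare an unlabeled test set in the character-per-line format described above, padding each character with a placeholder O tag so the existing data loader can read it. The file name and helper are hypothetical, based only on the format discussed in this thread:

```python
# Hypothetical helper: write raw sentences in the char-per-line format
# described above, with a dummy "O" tag so the loader treats the file
# like labeled data. The output path is a placeholder, not from the repo.
def write_dummy_test_file(sentences, path="my_test.char.bmes"):
    with open(path, "w", encoding="utf-8") as f:
        for sent in sentences:
            for ch in sent:
                f.write(f"{ch} O\n")   # one character + placeholder tag per line
            f.write("\n")              # blank line separates sentences

if __name__ == "__main__":
    write_dummy_test_file(["测试句子", "另一句话"])
```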

Llin1785361283 commented 2 years ago

@zhengguanyu Hello, when I load the model for testing I keep getting the error below. Have you run into anything similar?

RuntimeError: Error(s) in loading state_dict for GazLSTM:
size mismatch for NERmodel.lstm.weight_ih_l0: copying a param of torch.Size([1200, 250]) from checkpoint, where the shape is torch.Size([800, 250]) in current model.
size mismatch for NERmodel.lstm.weight_hh_l0: copying a param of torch.Size([1200, 300]) from checkpoint, where the shape is torch.Size([800, 200]) in current model.
size mismatch for NERmodel.lstm.bias_ih_l0: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.bias_hh_l0: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.weight_ih_l0_reverse: copying a param of torch.Size([1200, 250]) from checkpoint, where the shape is torch.Size([800, 250]) in current model.
size mismatch for NERmodel.lstm.weight_hh_l0_reverse: copying a param of torch.Size([1200, 300]) from checkpoint, where the shape is torch.Size([800, 200]) in current model.
size mismatch for NERmodel.lstm.bias_ih_l0_reverse: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for NERmodel.lstm.bias_hh_l0_reverse: copying a param of torch.Size([1200]) from checkpoint, where the shape is torch.Size([800]) in current model.
size mismatch for hidden2tag.weight: copying a param of torch.Size([7, 600]) from checkpoint, where the shape is torch.Size([7, 400]) in current model.
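A note that may help diagnose this (it is not from the thread itself): in PyTorch, an LSTM's weight_ih_l0 has shape [4 * hidden_size, input_size], so the 1200-vs-800 mismatch means the checkpoint was trained with hidden size 300 while the model being constructed uses 200 (likewise 600 vs 400 for the bidirectional output feeding hidden2tag). Rebuilding the model with the same hyperparameters used during training should resolve the error. Below is a hedged sketch for reading the hidden size straight out of a checkpoint; the file name is a placeholder:

```python
import torch

# Hypothetical checkpoint path; replace with the model file actually saved.
ckpt = torch.load("saved_model.model", map_location="cpu")
state = ckpt.get("state_dict", ckpt)  # handle either a wrapped or raw state_dict

# An LSTM's weight_ih_l0 has shape [4 * hidden_size, input_size],
# so the checkpoint's hidden size can be read off directly.
w = state["NERmodel.lstm.weight_ih_l0"]
print("checkpoint LSTM hidden size:", w.shape[0] // 4)                       # 1200 // 4 = 300
print("checkpoint hidden2tag input dim:", state["hidden2tag.weight"].shape[1])  # 600 = 2 * 300
```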