kmkurn / pytorch-crf

(Linear-chain) Conditional random field in PyTorch.
https://pytorch-crf.readthedocs.io
MIT License

Code error #82

Closed Wsl-hfut-nwpu closed 2 years ago

Wsl-hfut-nwpu commented 2 years ago

I think there is a problem in your code. When the parameter batch_first=True is passed at initialization, the later code does not account for swapping the batch dimension, because seq_length, batch_size = tags.shape forces the batch to be the second dimension.

kmkurn commented 2 years ago

Hi, I don't know Mandarin (I assume that's the language you wrote, sorry if I'm wrong) so I can't understand your text. Can you write it in English?

Emperorizzis commented 2 years ago

> Hi, I don't know Mandarin (I assume that's the language you wrote, sorry if I'm wrong) so I can't understand your text. Can you write it in English?

Hi~ Let me translate for you; he wrote in Chinese. What he means: "I think there is an error in your code. When the parameter 'batch_first=True' is passed during initialization, the subsequent code doesn't account for swapping the batch dimension, because 'seq_length, batch_size = tags.shape' restricts the batch to the second dimension." Hope it helps you understand~

kmkurn commented 2 years ago

@Emperorizzis Thanks for translating!

@Wsl-hfut-nwpu Are you referring to this line? https://github.com/kmkurn/pytorch-crf/blob/e843718540512eb04d9b756fca04a0915affe175/torchcrf/__init__.py#L181 That method is only called after the batch and sequence length dimensions have been swapped. If you think otherwise, can you give a minimal example that reproduces the error?
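To illustrate the point being discussed, here is a minimal sketch (assuming `torch` is available) of the dimension handling in question: when `batch_first=True`, the public-facing tensors are transposed to a `(seq_length, batch_size, ...)` layout before any internal method unpacks `seq_length, batch_size = tags.shape`, so that unpacking is consistent with the data it receives. The variable names below are illustrative, not the library's exact internals.

```python
import torch

# Batch-first tensors, as a user would pass with batch_first=True
batch_size, seq_length, num_tags = 2, 5, 3
emissions = torch.randn(batch_size, seq_length, num_tags)
tags = torch.zeros(batch_size, seq_length, dtype=torch.long)

# The swap performed before internal methods are called:
emissions_t = emissions.transpose(0, 1)  # -> (seq_length, batch_size, num_tags)
tags_t = tags.transpose(0, 1)            # -> (seq_length, batch_size)

# After the swap, the internal unpacking matches the layout
s, b = tags_t.shape
assert (s, b) == (seq_length, batch_size)
print(emissions_t.shape)  # torch.Size([5, 2, 3])
```

So the shape unpacking alone does not imply a bug; the question is whether the transpose happens before the internal method runs, which is what the linked line would confirm.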