Open pmichel31415 opened 6 years ago
After investigating a bit, I can make an educated guess about the cause: the attention weights returned by the `TransformerDecoderLayer` may not be masked properly over padding positions.
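To illustrate the suspected bug: if padding positions are not masked before the argmax that produces alignments, the alignment can point past the true source length. A minimal sketch of what proper masking could look like (hypothetical helper, not fairseq's actual code; `attn` and `src_lengths` are assumed shapes):

```python
import torch

def masked_argmax_alignment(attn: torch.Tensor, src_lengths: torch.Tensor) -> torch.Tensor:
    """Argmax alignment that ignores padding positions.

    attn:        (batch, tgt_len, src_len) attention weights
    src_lengths: (batch,) true source lengths, excluding padding
    """
    src_len = attn.size(-1)
    # Build a mask that is True at padded source positions.
    pad_mask = torch.arange(src_len)[None, None, :] >= src_lengths[:, None, None]
    # Set padded positions to -inf so they can never win the argmax.
    attn = attn.masked_fill(pad_mask, float("-inf"))
    return attn.argmax(dim=-1)
```

With weights `[[0.1, 0.2, 0.4, 0.3]]` and a true source length of 2, the unmasked argmax would return index 2 (a padding position), while the masked version returns index 1.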
The error goes away when I clamp the alignment indices in the alignment dictionary to the source sentence length, but this is just a band-aid.
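The band-aid amounts to clipping out-of-range alignment indices back into the valid source range. A minimal sketch, assuming tensor alignments (the function name is hypothetical):

```python
import torch

def clamp_alignment(alignment: torch.Tensor, src_len: int) -> torch.Tensor:
    """Clip alignment indices into [0, src_len - 1].

    Workaround only: alignments produced from unmasked attention can point
    past the true source length, so we clamp them back into range instead
    of fixing the masking itself.
    """
    return alignment.clamp(min=0, max=src_len - 1)
```

For example, `clamp_alignment(torch.tensor([0, 3, 7]), 5)` maps the out-of-range index 7 to 4, the last valid source position.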
@pmichel31415 @jhcross Is this still an issue?
It should be fixed by #226, which hasn't been merged yet.
I get the following error when decoding using the transformer with `--replace-unk`: