This commit introduces padding to the repetition of the encoded word, so that words can be generated whose lengths are not a factor of the width of the VGG output.
The encoded word is repeated as many times as fit, and the remainder is then filled with the embedded PAD_TOKEN so that the result matches this dimension.
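A minimal sketch of this repeat-and-pad scheme (the function name and the plain-list representation are illustrative; the actual code presumably operates on embedding tensors):

```python
def tile_and_pad(encoded_word, target_width, pad_embedding):
    """Repeat the encoded word as many times as fit into target_width
    time steps, then fill the remainder with the embedded PAD_TOKEN.

    encoded_word:  sequence of embedding vectors for one word
    target_width:  width of the VGG output (number of time steps)
    pad_embedding: embedding vector of PAD_TOKEN
    """
    repeats = target_width // len(encoded_word)        # full repetitions that fit
    remainder = target_width - repeats * len(encoded_word)
    return list(encoded_word) * repeats + [pad_embedding] * remainder

# Example: a word of length 3 tiled into width 8 -> 2 repetitions + 2 pads
word = [[1.0], [2.0], [3.0]]
pad = [0.0]
out = tile_and_pad(word, 8, pad)
```

With `target_width = 8` and a word of length 3, two full repetitions (6 steps) fit and the last two steps receive the PAD_TOKEN embedding, so the output always matches the VGG width exactly.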