Thank you for sharing this project's code; I have a question about nn.Embedding.

In this project, the shape of `src` and `trg` is (maxLen, batch size). When I debug the Encoder's forward, the shape of `src` is (37, 32), where 32 is the batch size. However, the explanation of nn.Embedding shows this example code:

>>> # a batch of 2 samples of 4 indices each
>>> input = torch.LongTensor([[1,2,4,5],[4,3,2,9]])
>>> embedding(input)

which suggests the input of Embedding should be (batch size, maxLen).

This confuses me. Any suggestion is appreciated!
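To illustrate the two layouts being compared, here is a minimal sketch (the vocabulary size 10 and embedding dimension 3 are arbitrary values chosen for the example, not taken from the project):

```python
import torch
import torch.nn as nn

# Arbitrary sizes for illustration: vocabulary of 10, embedding dim 3.
embedding = nn.Embedding(10, 3)

# (batch size, maxLen) layout, as in the nn.Embedding docs example.
batch_first = torch.LongTensor([[1, 2, 4, 5], [4, 3, 2, 9]])
print(embedding(batch_first).shape)  # torch.Size([2, 4, 3])

# (maxLen, batch size) layout, as src/trg are shaped in this project.
seq_first = batch_first.t()
print(embedding(seq_first).shape)    # torch.Size([4, 2, 3])
```

Both calls run without error: nn.Embedding looks up each index independently, so the output simply mirrors the input shape with embedding_dim appended as the last dimension.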