hfxunlp / transformer

Neutron: A PyTorch-based implementation of the Transformer and its variants.
https://github.com/hfxunlp/transformer
GNU General Public License v3.0

Can't give odd number embedding size for isize #1

Closed · StalVars closed this 5 years ago

StalVars commented 5 years ago

Currently this only accepts an even isize; with isize=141 below, it failed (probably because it doesn't know how to split into self.w[:, 0::2] and self.w[:, 1::2]?).

```python
self.encoder = Encoder(isize=isize, num_layer=6, nwd=vocab_size)
```

File "./transformer/Encoder.py", line 78, in init self.pemb = PositionalEmb(isize, xseql, 0, 0) File "./transformer/modules.py", line 46, in init self.reset_parameters() File "./transformer/modules.py", line 68, in reset_parameters self.w[:, 0::2], self.w[:, 1::2] = torch.sin(pos rdiv_term), torch.cos(pos rdiv_term) RuntimeError: The expanded size of the tensor (70) must match the existing size (71) at non-singleton dimension 1. Target sizes: [512, 70]. Tensor sizes: [512, 71]

hfxunlp commented 5 years ago

@StalVars we observed this issue very early, as noted in the comment; several other implementations simply ignore it, but it should be fixed now in the master branch.
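For anyone who cannot update yet, one possible workaround (a sketch under the same assumptions as the snippet above, not necessarily the exact change applied on master) is to compute both halves at the even-index width and drop the extra cos column when `isize` is odd:

```python
import math

import torch

def sinusoidal_table(isize, xseql):
    # Build the positional-embedding table so that odd isize also works.
    w = torch.zeros(xseql, isize)
    pos = torch.arange(xseql, dtype=torch.float).unsqueeze(1)
    rdiv_term = torch.exp(
        torch.arange(0, isize, 2, dtype=torch.float) * (-math.log(10000.0) / isize)
    )
    w[:, 0::2] = torch.sin(pos * rdiv_term)
    # The odd-index slice has one fewer column when isize is odd,
    # so trim the cos half to floor(isize / 2) columns.
    w[:, 1::2] = torch.cos(pos * rdiv_term)[:, : isize // 2]
    return w

table = sinusoidal_table(141, 512)  # no longer raises for odd isize
```

For even `isize` the trim is a no-op, so the table is identical to the original construction.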