File "./transformer/Encoder.py", line 78, in init
self.pemb = PositionalEmb(isize, xseql, 0, 0)
File "./transformer/modules.py", line 46, in init
self.reset_parameters()
File "./transformer/modules.py", line 68, in reset_parameters
self.w[:, 0::2], self.w[:, 1::2] = torch.sin(pos rdiv_term), torch.cos(pos rdiv_term)
RuntimeError: The expanded size of the tensor (70) must match the existing size (71) at non-singleton dimension 1. Target sizes: [512, 70]. Tensor sizes: [512, 71]
@StalVars we observed this issue very early, as noted in the comment; several other implementations simply ignore it, but it should be fixed now in the master branch.
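For reference, here is a minimal sketch of one way to make a sinusoidal position table tolerate an odd `isize`: give the sin half `ceil(isize / 2)` frequencies and drop the last one for the cos half. The function name `build_sinusoidal_table` and the exact formula are illustrative only, not the actual `PositionalEmb` code in `transformer/modules.py`.

```python
import math
import torch

def build_sinusoidal_table(xseql: int, isize: int) -> torch.Tensor:
    """Sinusoidal position table that also works for an odd isize (sketch)."""
    pos = torch.arange(xseql, dtype=torch.float32).unsqueeze(1)   # (xseql, 1)
    n_sin = (isize + 1) // 2   # isize=141 -> 71 sin columns
    n_cos = isize // 2         # isize=141 -> 70 cos columns
    # frequencies for the even-indexed (sin) columns: exp(2i * -ln(10000) / isize)
    idx = torch.arange(n_sin, dtype=torch.float32)
    rdiv_term = torch.exp(idx * 2.0 * (-math.log(10000.0) / isize))  # (n_sin,)
    w = torch.zeros(xseql, isize)
    w[:, 0::2] = torch.sin(pos * rdiv_term)           # (xseql, n_sin)
    w[:, 1::2] = torch.cos(pos * rdiv_term[:n_cos])   # drop the extra frequency for cos
    return w

print(build_sinusoidal_table(512, 141).shape)  # torch.Size([512, 141])
```

With an even `isize` the two halves have the same width and `rdiv_term[:n_cos]` is just `rdiv_term`, so this reduces to the standard construction.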
Currently it only accepts an even isize; below, isize=141 fails (probably because it doesn't know how to split the table into self.w[:, 0::2] and self.w[:, 1::2]?):
File "./transformer/Encoder.py", line 78, in init self.pemb = PositionalEmb(isize, xseql, 0, 0) File "./transformer/modules.py", line 46, in init self.reset_parameters() File "./transformer/modules.py", line 68, in reset_parameters self.w[:, 0::2], self.w[:, 1::2] = torch.sin(pos rdiv_term), torch.cos(pos rdiv_term) RuntimeError: The expanded size of the tensor (70) must match the existing size (71) at non-singleton dimension 1. Target sizes: [512, 70]. Tensor sizes: [512, 71]