erksch / fnet-pytorch

Unofficial PyTorch implementation of Google's FNet: Mixing Tokens with Fourier Transforms. With checkpoints.
MIT License

Masking/padding tokens in sequences? #12

Open d5555 opened 2 years ago

d5555 commented 2 years ago

How can we mask or pad tokens for sequences of varying length? The FFT is applied along the sequence dimension (`dim=-2`), so if we just use zero padding the result will be skewed: `torch.fft.fft(torch.fft.fft(hidden_states.float(), dim=-1), dim=-2).real`
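A minimal sketch of the problem, using the mixing expression quoted above (the tensor shapes and padding length here are made up for illustration): the DFT along `dim=-2` mixes every sequence position, including padded ones, into every output position, so the outputs for the real tokens change once zeros are appended.

```python
import torch

def fourier_mixing(hidden_states):
    # FNet token mixing: FFT over the hidden dim, then over the
    # sequence dim, keeping only the real part.
    return torch.fft.fft(
        torch.fft.fft(hidden_states.float(), dim=-1), dim=-2
    ).real

torch.manual_seed(0)
seq = torch.randn(4, 8)                        # 4 real tokens, hidden size 8
padded = torch.cat([seq, torch.zeros(2, 8)])   # zero-pad to length 6

out_short = fourier_mixing(seq)
out_padded = fourier_mixing(padded)[:4]        # outputs at the real-token positions

# The DFT length changes from 4 to 6, so every frequency component --
# and therefore every output position -- differs between the two runs.
print(torch.allclose(out_short, out_padded))
```

This is why zero padding alone is not equivalent to masking: unlike attention, where padded positions can be excluded via an additive mask, the FFT has no built-in notion of ignoring positions.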