lucidrains / perceiver-pytorch

Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
MIT License

Starting point of position encoding #29

Closed ohwi closed 3 years ago

ohwi commented 3 years ago

Hi. First of all, thank you for sharing the code.

As far as I understand, the line below evenly spaces the exponents of the frequency scales.

https://github.com/lucidrains/perceiver-pytorch/blob/27a69f6c75939b0ebb1b0655695a54bb09289aca/perceiver_pytorch/perceiver_pytorch.py#L35

If I've understood the code correctly, I think the start value 1 needs to be 0, since log 1 = 0, in the same way the end point uses max_freq / 2.
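
A small sketch of what I mean (assuming the line is essentially the `torch.logspace` call below; the values are just for illustration):

```python
import torch
from math import log

# illustrative values only
base, max_freq, num_bands = 2, 10., 4

# current: a start exponent of 1. means the lowest scale is base ** 1 = 2
current = torch.logspace(1., log(max_freq / 2) / log(base), num_bands, base = base)

# proposed: a start exponent of 0. (= log(1) / log(base)) makes the lowest scale base ** 0 = 1
proposed = torch.logspace(0., log(max_freq / 2) / log(base), num_bands, base = base)

print(current)   # ≈ tensor([2.0000, 2.7144, 3.6840, 5.0000])
print(proposed)  # ≈ tensor([1.0000, 1.7100, 2.9240, 5.0000])
```

Both versions end at max_freq / 2 = 5, but only the proposed start of 0 includes the lowest frequency band of 1.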

lucidrains commented 3 years ago

@ohwi ohh, you may be right

lucidrains commented 3 years ago

@ohwi should be fixed in 0.2.1!