Open katop1234 opened 1 year ago
Hi,
I was looking to implement the RIN architecture using your code. I noticed you apply the positional embedding in every block, as opposed to just once at the beginning. What was the reasoning behind this?
https://github.com/lucidrains/recurrent-interface-network-pytorch/blob/main/rin_pytorch/rin_pytorch.py#L312
And thank you for writing this code!
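For context, here is a minimal sketch of the distinction being asked about: re-adding a learned positional embedding before every block versus adding it once at the input. This is an illustrative toy example only (class and parameter names are made up, and it uses a stock `TransformerEncoderLayer` rather than the actual rin_pytorch blocks):

```python
import torch
import torch.nn as nn

class PerBlockPosEmb(nn.Module):
    """Illustrative: re-inject the positional embedding before every block."""
    def __init__(self, dim, seq_len, depth, heads=4):
        super().__init__()
        self.pos_emb = nn.Parameter(torch.randn(seq_len, dim) * 0.02)
        self.blocks = nn.ModuleList(
            nn.TransformerEncoderLayer(dim, nhead=heads, batch_first=True)
            for _ in range(depth)
        )

    def forward(self, x):
        for block in self.blocks:
            # positions are added again at every depth
            x = block(x + self.pos_emb)
        return x

class InputOnlyPosEmb(nn.Module):
    """Illustrative: add the positional embedding once, at the input."""
    def __init__(self, dim, seq_len, depth, heads=4):
        super().__init__()
        self.pos_emb = nn.Parameter(torch.randn(seq_len, dim) * 0.02)
        self.blocks = nn.ModuleList(
            nn.TransformerEncoderLayer(dim, nhead=heads, batch_first=True)
            for _ in range(depth)
        )

    def forward(self, x):
        # positions are added only here; deeper blocks must carry them forward
        x = x + self.pos_emb
        for block in self.blocks:
            x = block(x)
        return x
```

In the input-only variant, positional information can get washed out by residual updates as depth grows, whereas the per-block variant keeps it explicitly available at every layer; whether that was the actual motivation here is the question.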