[Closed] CiaoHe closed this issue 1 year ago
Hi Phil! Compared to Algorithm 3, Line 4, I found a small inconsistency:
https://github.com/lucidrains/recurrent-interface-network-pytorch/blob/19862012f9f685d7a9dd6c2f88609b34f015dbf2/rin_pytorch/rin_pytorch.py#L310
maybe it needs to be:
```python
latents = self.latents_attend_to_patches(latents, patches, time = t) + latents
latents = self.latents_cross_attn_ff(latents, time = t) + latents
```
just curious 😁
@CiaoHe Hi He Cao! Yes indeed, all attention should be followed by feedforwards, thank you for catching this!
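For anyone landing here later, below is a minimal, self-contained sketch of the corrected "read" step: latents cross-attend to patches, then pass through a feedforward, each wrapped in a residual. It uses plain `nn.MultiheadAttention` and a standard MLP as stand-ins for the repo's time-conditioned `Attention` and `FeedForward` modules, and drops the `time = t` conditioning entirely, so it illustrates only the block structure, not the repo's actual implementation.

```python
import torch
from torch import nn

class LatentsReadBlock(nn.Module):
    """Sketch: latents cross-attend to patches, then a feedforward,
    each with a residual connection (per Algorithm 3, Line 4)."""

    def __init__(self, dim, heads = 8, ff_mult = 4):
        super().__init__()
        self.attn_norm = nn.LayerNorm(dim)
        # stand-in for latents_attend_to_patches (no time conditioning here)
        self.latents_attend_to_patches = nn.MultiheadAttention(dim, heads, batch_first = True)
        # stand-in for latents_cross_attn_ff
        self.latents_cross_attn_ff = nn.Sequential(
            nn.LayerNorm(dim),
            nn.Linear(dim, dim * ff_mult),
            nn.GELU(),
            nn.Linear(dim * ff_mult, dim)
        )

    def forward(self, latents, patches):
        # cross attention: latents read from patches, plus residual
        attended, _ = self.latents_attend_to_patches(self.attn_norm(latents), patches, patches)
        latents = attended + latents
        # the fix: a feedforward after the cross attention, also with a residual
        latents = self.latents_cross_attn_ff(latents) + latents
        return latents

block = LatentsReadBlock(dim = 512)
latents = torch.randn(2, 128, 512)    # (batch, num latents, dim)
patches = torch.randn(2, 1024, 512)   # (batch, num patches, dim)
assert block(latents, patches).shape == (2, 128, 512)
```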