Hi. Thanks for your implementation; it is very helpful. However, I noticed that the dropout is missing in the IPAModule:
https://github.com/lucidrains/invariant-point-attention/blob/de337568959eb7611ba56eace2f642ca41e26216/invariant_point_attention/invariant_point_attention.py#L239
In the AlphaFold2 supplementary, the dropout is nested inside the layer norm, and the same holds for the layer norm in the transition layer (line 9 in the figure below).
If you think this is a problem, please let me know and I will submit a PR to fix it. Thanks again for sharing such an amazing repo.
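For reference, a minimal sketch of the placement I mean, assuming PyTorch and illustrative names (this is not the repo's actual API): the block output is dropped out first, then added to the residual stream, and the layer norm is applied to the sum, i.e. `s = LayerNorm(s + Dropout(update))`.

```python
import torch
import torch.nn as nn

class PostBlockResidual(nn.Module):
    """Illustrative sketch of the dropout placement described in the
    AlphaFold2 supplementary: dropout on the block output, then the
    residual add, then layer norm. Class and argument names are
    hypothetical, not taken from the repo."""

    def __init__(self, dim, dropout=0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.norm = nn.LayerNorm(dim)

    def forward(self, s, update):
        # dropout is applied to the update *inside* the pre-norm sum,
        # so the layer norm sees the dropped-out residual
        return self.norm(s + self.dropout(update))

# usage: the same wrapper would apply after both the IPA output and
# the transition layer output
dim = 64
layer = PostBlockResidual(dim)
s = torch.randn(2, 16, dim)
update = torch.randn(2, 16, dim)
out = layer(s, update)  # shape (2, 16, dim)
```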
Best, Zhangzhi Peng