lucidrains / invariant-point-attention

Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module
MIT License

Equivariance test for IPA Transformer #6

Open amrhamedp opened 2 years ago

amrhamedp commented 2 years ago

@lucidrains I would like to ask about the equivariance of the transformer (not the IPA blocks). I wonder whether you checked that the output is equivariant when the local points are transformed to global points using the updated quaternions and translations. This test fails in my case, and I am not sure why.
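For reference, here is a minimal sketch of the kind of equivariance check being described. It assumes the `IPATransformer` forward accepts `translations` and `quaternions` keyword arguments (quaternions in `(w, x, y, z)` order, identity `[1, 0, 0, 0]`) and returns `(single_repr, translations, quaternions)`; these details are assumptions, not confirmed from the source. Note that a rigid rotation of the input has to rotate both the translations and the frame orientations, since rotating the translations alone is not a rigid motion of the frames.

```python
import math
import torch
from invariant_point_attention import IPATransformer

# a rotation about the z-axis, expressed both as a quaternion (w, x, y, z)
# and as a 3x3 matrix, so the same rigid motion can be applied to the input
# frame orientations and to the translations
theta = 0.7
quat = torch.tensor([math.cos(theta / 2), 0., 0., math.sin(theta / 2)])
R = torch.tensor([
    [math.cos(theta), -math.sin(theta), 0.],
    [math.sin(theta),  math.cos(theta), 0.],
    [0.,               0.,              1.],
])

# assumed constructor arguments, following the repo's README-style usage
model = IPATransformer(dim=32, depth=2, require_pairwise_repr=False)
model.eval()

b, n = 1, 16
seq = torch.randn(b, n, 32)           # invariant single representation
translations = torch.randn(b, n, 3)   # initial frame translations
identity_quats = torch.tensor([1., 0., 0., 0.]).expand(b, n, -1)
rotated_quats = quat.expand(b, n, -1)

with torch.no_grad():
    # assumed return order: (single_repr, translations, quaternions)
    _, out_t, _ = model(seq, translations=translations, quaternions=identity_quats)
    _, out_t_rot, _ = model(seq, translations=translations @ R.T, quaternions=rotated_quats)

# equivariance check: rotating the input frames should rotate the predicted
# translations by the same rotation, up to numerical tolerance
print(torch.allclose(out_t @ R.T, out_t_rot, atol=1e-5))
```

If a check like this fails while the per-block IPA invariance tests pass, one thing worth ruling out is whether the frame orientations (quaternions) were left at identity while only the translations were rotated.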

pengzhangzhi commented 2 years ago

I would like to ask about equivariance. What does it even mean, and why does it matter?