lucidrains/invariant-point-attention
Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of AlphaFold2, as a standalone PyTorch module.
MIT License · 147 stars · 9 forks
Issues
#10 · Normalization of quaternion_update · ShintaroMinami · closed · 1 year ago · 1 comment
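Issue #10 questions whether the predicted quaternion update is normalized before it is composed into the running rotation. A minimal sketch of that normalization, assuming quaternions stored as (w, x, y, z) in a trailing dimension of 4; the function names here are illustrative, not the repo's API:

```python
import torch
import torch.nn.functional as F

def quaternion_multiply(q, r):
    # Hamilton product of two (..., 4) quaternions stored as (w, x, y, z).
    w1, x1, y1, z1 = q.unbind(-1)
    w2, x2, y2, z2 = r.unbind(-1)
    return torch.stack((
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ), dim=-1)

def apply_quaternion_update(quaternions, update):
    # Normalize the predicted update to unit length before composing,
    # so the composed result still represents a pure rotation.
    update = F.normalize(update, dim=-1)
    return quaternion_multiply(quaternions, update)
```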
#9 · In-place rotation detach not allowed · sidnarayanan · closed · 1 year ago · 1 comment
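Issue #9 reports a runtime error from detaching the rotations in place during iterative refinement. Autograd can reject the in-place detach_() when the tensor is a view or participates in the graph, while the out-of-place .detach() returns a new gradient-free tensor. A sketch of the distinction, with a placeholder refinement loop:

```python
import torch

def refine(rotations, translations, num_iter=3):
    # Placeholder refinement loop; the real module also updates the
    # representations. AlphaFold2-style structure modules stop the
    # gradient through the rotations between iterations.
    for _ in range(num_iter):
        # ... compute rotation / translation updates here ...
        rotations = rotations.detach()   # out-of-place: returns a new tensor
        # rotations.detach_()            # in-place variant that autograd can reject
    return rotations, translations
```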
#8 · Report a bug that causes instability in training · pengzhangzhi · closed · 2 years ago · 1 comment
#7 · Subtle mistake in the implementation · pengzhangzhi · closed · 2 years ago · 1 comment
#6 · Equivariance test for IPA Transformer · amrhamedp · open · 2 years ago · 1 comment
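Issue #6 asks for an equivariance test for the IPA Transformer: rigidly rotating and translating the inputs should move the output coordinates by exactly the same rigid motion. A sketch of such a check, assuming a model that maps (features, coordinates) to coordinates; adapt the call signature to the actual module:

```python
import torch

def random_rotation():
    # QR decomposition of a random matrix gives a random rotation,
    # after fixing the sign of the determinant to +1.
    q, _ = torch.linalg.qr(torch.randn(3, 3))
    if torch.det(q) < 0:
        q[:, 0] = -q[:, 0]
    return q

def check_equivariance(model, feats, coords, atol=1e-5):
    R = random_rotation()
    t = torch.randn(3)
    out = model(feats, coords)
    out_transformed = model(feats, coords @ R.T + t)
    # Equivariance: transforming the input should transform the output identically.
    return torch.allclose(out @ R.T + t, out_transformed, atol=atol)
```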
#5 · Application of Invariant Point Attention: preserve part of structure · hw-protein · closed · 2 years ago · 0 comments
#4 · Computing point dist - use cartesian dimension instead of hidden dimension · aced125 · closed · 2 years ago · 3 comments
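Issue #4 flags that the squared point distances in the attention logits must be reduced over the 3-component Cartesian axis (x, y, z), not over a hidden/feature axis. A sketch of the intended reduction, with illustrative shapes:

```python
import torch

# q_points, k_points: (batch, heads, seq, n_points, 3) query/key points,
# assumed already placed in the global frame.
q_points = torch.randn(2, 4, 16, 8, 3)
k_points = torch.randn(2, 4, 16, 8, 3)

diff = q_points.unsqueeze(3) - k_points.unsqueeze(2)   # (b, h, i, j, p, 3)
# Sum squared differences over the last (Cartesian) axis first, then over
# the points axis; reducing over a hidden axis instead silently changes
# the geometry the logits see.
sq_dist = (diff ** 2).sum(dim=-1)                      # (b, h, i, j, p)
point_logits = -0.5 * sq_dist.sum(dim=-1)              # (b, h, i, j)
```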
#3 · Enforces FP32 in precision-sensitive part · hypnopump · closed · 3 years ago · 0 comments
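PR #3 forces FP32 in a precision-sensitive part; under mixed precision, summed squared distances can overflow or lose precision in FP16. A sketch of the usual pattern, locally disabling autocast (shapes and names are illustrative, not the repo's exact code):

```python
import torch

def point_attention_logits(q_points, k_points):
    # Squared distances accumulate many terms, so compute them in FP32
    # even when the surrounding model runs under autocast / FP16.
    with torch.autocast(device_type=q_points.device.type, enabled=False):
        q = q_points.float()
        k = k_points.float()
        diff = q.unsqueeze(-3) - k.unsqueeze(-4)       # (..., i, j, p, 3)
        return -0.5 * (diff ** 2).sum(dim=(-1, -2))    # (..., i, j)
```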
#2 · Change quaternion update to match the original AlphaFold2 · ShintaroMinami · closed · 3 years ago · 1 comment
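Issue #2 proposes composing the quaternion update the way the original AlphaFold2 does: predict only the vector part (b, c, d), fix the scalar part to 1, and normalize, which keeps each update close to the identity rotation. A minimal sketch of that scheme:

```python
import torch
import torch.nn.functional as F

def alphafold2_style_update(bcd):
    # bcd: (..., 3) predicted imaginary components. The scalar part is
    # fixed to 1, then the whole quaternion is normalized to unit length,
    # as in the AlphaFold2 supplementary material.
    ones = torch.ones_like(bcd[..., :1])
    quat = torch.cat((ones, bcd), dim=-1)   # (..., 4) as (1, b, c, d)
    return F.normalize(quat, dim=-1)
```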
#1 · #126 maybe omit the 'self.point_attn_logits_scale'? · CiaoHe · closed · 3 years ago · 1 comment
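Issue #1 asks whether self.point_attn_logits_scale (referenced at line 126 of the source) can be dropped. In the AlphaFold2 supplement the point term carries a derived weight rather than an arbitrary one, so omitting it changes the balance between the scalar, point, and pairwise contributions to the logits. A sketch of where the factor plausibly comes from; the exact composition in this repo may differ:

```python
import math

num_query_points = 4          # illustrative; matches a typical IPA config
num_attn_logit_terms = 3      # scalar, point, and pairwise terms share the logit

# AlphaFold2 (suppl. Algorithm 22) weights the point term by
# w_C = sqrt(2 / (9 * N_query_points)) and every term by
# w_L = sqrt(1 / number_of_terms).
point_weight = math.sqrt(2.0 / (9.0 * num_query_points))
per_term_weight = math.sqrt(1.0 / num_attn_logit_terms)
point_attn_logits_scale = point_weight * per_term_weight
```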