

LEARNING FROM PROTEIN STRUCTURE WITH GEOMETRIC VECTOR PERCEPTRONS #37

Open 5g4s opened 1 year ago

5g4s commented 1 year ago

https://arxiv.org/abs/2009.01411

5g4s commented 1 year ago

There has yet to emerge a unifying network architecture that simultaneously leverages the geometric and relational aspects of the problem domain.

5g4s commented 1 year ago

To address this gap, we introduce geometric vector perceptrons, which extend standard dense layers to operate on collections of Euclidean vectors. Graph neural networks equipped with such layers are able to perform both geometric and relational reasoning on efficient representations of macromolecules.
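A minimal PyTorch sketch of what such a layer could look like, roughly following the paper's description of a GVP acting on a tuple of scalar features `s` and vector features `V`; the class name, dimension arguments, and the ReLU/sigmoid nonlinearities here are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class GVP(nn.Module):
    """Sketch of a geometric vector perceptron: maps (scalars, vectors) to
    (scalars, vectors). Scalars are updated from the norms of transformed
    vectors, so they stay rotation-invariant; vectors are only mixed across
    channels (no bias), so they stay rotation-equivariant."""

    def __init__(self, in_scalar, in_vector, out_scalar, out_vector):
        super().__init__()
        h = max(in_vector, out_vector)                     # hidden vector channels
        self.W_h = nn.Linear(in_vector, h, bias=False)     # channel mixing only
        self.W_mu = nn.Linear(h, out_vector, bias=False)
        self.W_m = nn.Linear(h + in_scalar, out_scalar)

    def forward(self, s, V):
        # s: (..., in_scalar) scalar features; V: (..., in_vector, 3) vector features
        V_h = self.W_h(V.transpose(-1, -2)).transpose(-1, -2)      # (..., h, 3)
        V_mu = self.W_mu(V_h.transpose(-1, -2)).transpose(-1, -2)  # (..., out_vector, 3)
        s_h = V_h.norm(dim=-1)                  # invariant norms feed the scalar path
        s_out = torch.relu(self.W_m(torch.cat([s_h, s], dim=-1)))
        V_out = torch.sigmoid(V_mu.norm(dim=-1, keepdim=True)) * V_mu  # norm-gated vectors
        return s_out, V_out
```

For example, `GVP(6, 3, 100, 16)` would map features with 6 scalar and 3 vector channels to 100 scalar and 16 vector channels. Information flows from the vector channels into the scalar channels only through norms, which is what keeps the scalar outputs rotation-invariant.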

5g4s commented 1 year ago

We do so by introducing geometric vector perceptrons (GVPs), a drop-in replacement for standard multi-layer perceptrons (MLPs) in aggregation and feed-forward layers of GNNs. GVPs operate directly on both scalar and geometric features.
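One way to picture the "drop-in replacement" in the aggregation step is sketched below; the gather-and-mean-pool interface is an assumption about a reasonable GNN formulation, not the paper's exact propagation rule.

```python
import torch

def gvp_message_passing(node_s, node_V, edge_s, edge_V, edge_index, message_gvp):
    """One propagation step: for each edge j -> i, build a message from the
    source node's (scalar, vector) features and the edge's (scalar, vector)
    features with a GVP, then mean-pool messages at each destination node."""
    src, dst = edge_index                                # each of shape (E,)
    msg_s, msg_V = message_gvp(
        torch.cat([node_s[src], edge_s], dim=-1),        # concatenate scalar channels
        torch.cat([node_V[src], edge_V], dim=-2),        # concatenate vector channels
    )
    n = node_s.shape[0]
    agg_s = torch.zeros(n, msg_s.shape[-1]).index_add_(0, dst, msg_s)
    agg_V = torch.zeros(n, *msg_V.shape[1:]).index_add_(0, dst, msg_V)
    deg = torch.zeros(n).index_add_(0, dst, torch.ones(dst.shape[0])).clamp(min=1)
    return agg_s / deg[:, None], agg_V / deg[:, None, None]
```

The aggregated message would then update the node embedding (the paper uses residual-style node updates), and the per-node feed-forward step is likewise a small stack of GVPs instead of an MLP.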

5g4s commented 1 year ago

Our method (GVP-GNN) can be applied to any problem where the input domain is a structure of a single macromolecule or of molecules bound to one another.

5g4s commented 1 year ago

The key challenge with this representation, however, is to perform graph propagation in a way that preserves the full expressive power of the original GNN while maintaining the rotation invariance provided by the scalar representations.

5g4s commented 1 year ago

The vector and scalar outputs of the GVP are equivariant and invariant, respectively, with respect to an arbitrary composition R of rotations and reflections in 3D Euclidean space.
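A small numerical check (illustrative, not from the paper) of the two mechanisms behind this claim: the scalar path depends on vectors only through their norms, which R cannot change, and the vector path only mixes channels with a weight matrix, which commutes with R applied to the 3D coordinates.

```python
import torch

torch.manual_seed(0)
nu, h = 16, 8
V = torch.randn(nu, 3)                   # nu vector channels in 3D
W = torch.randn(h, nu)                   # channel-mixing weights (left multiplication)

# Random orthogonal matrix via QR; both rotations (det +1) and reflections (det -1)
# preserve norms and commute with channel mixing, matching the claim above.
R, _ = torch.linalg.qr(torch.randn(3, 3))

# Invariance: row-wise norms are unchanged when every vector is transformed by R
assert torch.allclose(V.norm(dim=-1), (V @ R).norm(dim=-1), atol=1e-5)

# Equivariance: transforming then mixing equals mixing then transforming
assert torch.allclose(W @ (V @ R), (W @ V) @ R, atol=1e-5)
print("invariance and equivariance checks passed")
```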