lucidrains / En-transformer

Implementation of E(n)-Transformer, which incorporates attention mechanisms into Welling's E(n)-Equivariant Graph Neural Network
MIT License

Updates Rotary Dep #2

Closed · hypnopump closed this 1 year ago

hypnopump commented 3 years ago

Imports the rotary embedding from its package instead of repeating the code here.
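
For context, a minimal sketch of what consuming the rotary dependency from the `rotary-embedding-torch` package (rather than vendoring the code) can look like; the tensor shapes here are illustrative only:

```python
import torch
from rotary_embedding_torch import RotaryEmbedding  # the external rotary dependency

# rotary embedding over the per-head feature dimension (dim is illustrative)
rotary_emb = RotaryEmbedding(dim = 32)

# mock queries and keys: (batch, heads, seq len, head dim)
q = torch.randn(1, 8, 128, 64)
k = torch.randn(1, 8, 128, 64)

# rotate queries and keys before the attention dot product
q = rotary_emb.rotate_queries_or_keys(q)
k = rotary_emb.rotate_queries_or_keys(k)
```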

hypnopump commented 3 years ago

The same could be done in AF2, by the way.

hypnopump commented 3 years ago

This now also allows a linear projection of the edge and node features before they are passed to the transformer (see the sketch below).
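
A minimal sketch of what such an input projection could look like, assuming hypothetical input and output dimensions and standalone `nn.Linear` layers (the actual projection would live inside the package):

```python
import torch
from torch import nn

# hypothetical dimensions, for illustration only
feats_dim_in = 23   # raw node feature size
edges_dim_in = 5    # raw edge feature size
dim = 64            # feature dimension expected by the transformer
edge_dim = 16       # edge dimension expected by the transformer

feat_proj = nn.Linear(feats_dim_in, dim)
edge_proj = nn.Linear(edges_dim_in, edge_dim)

feats = torch.randn(1, 32, feats_dim_in)        # (batch, nodes, feat dim)
edges = torch.randn(1, 32, 32, edges_dim_in)    # (batch, nodes, nodes, edge dim)

feats = feat_proj(feats)   # -> (1, 32, dim)
edges = edge_proj(edges)   # -> (1, 32, 32, edge_dim)
# feats and edges would then be passed to the En-Transformer layers
```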

lucidrains commented 1 year ago

Closing, since the library is moving away from rotary embeddings for extrapolation purposes.