lucidrains / se3-transformer-pytorch

Implementation of SE3-Transformers for Equivariant Self-Attention, in PyTorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
MIT License

faster loop #8

Closed by hypnopump 3 years ago

hypnopump commented 3 years ago

My small grain of sand for this project ;) : at the very least, avoid plain Python appends in a loop, which are roughly 2x slower than a list comprehension.
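As a rough illustration of the pattern being suggested (a hypothetical toy micro-benchmark, not the actual diff from this PR), the comprehension avoids the repeated attribute lookup and call overhead of `list.append` inside the loop body:

```python
# Toy sketch: building a list via .append in a loop vs. a list comprehension.
# The exact speedup depends on the workload; this just shows the pattern.
from timeit import timeit

def with_append(n):
    out = []
    for i in range(n):
        out.append(i * i)  # attribute lookup + method call on every iteration
    return out

def with_comprehension(n):
    return [i * i for i in range(n)]  # same result, single specialized bytecode loop

print('append loop:   ', timeit(lambda: with_append(10_000), number=1_000))
print('comprehension: ', timeit(lambda: with_comprehension(10_000), number=1_000))
```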

If this is of any help, here are some considerations (there might be misunderstandings on my side due to the decorators and so on):

lucidrains commented 3 years ago

i'll take this microoptimization :D