Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with an eventual Alphafold2 replication.
My small grain of sand for this project ;) : at the very least, avoid Python `append` calls, which are roughly 2x slower than a list comprehension.
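For what it's worth, a quick way to see the gap (timings are machine-dependent; the difference is usually somewhere around 1.5-2x):

```python
import timeit

def with_append(n=10_000):
    out = []
    for i in range(n):
        out.append(i * i)   # repeated method lookup + call on every iteration
    return out

def with_comprehension(n=10_000):
    return [i * i for i in range(n)]  # same loop, without the append overhead

print(timeit.timeit(with_append, number=1_000))
print(timeit.timeit(with_comprehension, number=1_000))
```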
If this is of any help, here are some considerations (there might be misunderstandings on my side due to the decorators and so on):
I have my reservations about the utility of line 62 in `utils.py`.
Wouldn't it make more sense to run the `for i` loop (line 122 of `utils.py`) in reverse order, and then reuse the cached calculations for `lpmv()`? See the sketch below.
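To make the reverse-order point concrete, here is a minimal sketch with a toy memoized recursion (`term` is purely illustrative, not the repo's `lpmv`):

```python
from functools import lru_cache

# Toy stand-in for a memoized recursive quantity such as lpmv (the real
# recursion in the repo differs; this only illustrates the cache reuse).
@lru_cache(maxsize=None)
def term(l):
    if l < 2:
        return float(l)
    return (2 * l - 1) * term(l - 1) - (l - 1) * term(l - 2)

# Ascending order: the cache grows one level per iteration.
ascending = [term(l) for l in range(11)]

# Descending order: the first call (l = 10) fills the whole cache via the
# recursion, and every remaining call is a pure lookup. With full
# memoization the total work is similar either way, so whether this helps
# in practice depends on how the repo's own cache is keyed and reused.
term.cache_clear()
descending = [term(l) for l in reversed(range(11))]
```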
Same case (loop in reverse order) for the `for` loop at line 148 of `basis.py`?
If you use `scipy.special.poch` (which can deal with numpy arrays) instead of the custom `pochhammer` implementation, all operations inside `get_spherical_harmonics_element` become vectorizable except for the `lpmv` function call.
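A small sketch of that vectorization point (`pochhammer_scalar` below is only a stand-in for the repo's custom implementation, whose exact signature may differ):

```python
import numpy as np
from scipy.special import poch

# Scalar rising factorial, similar in spirit to a hand-rolled pochhammer.
def pochhammer_scalar(x, k):
    out = 1.0
    for i in range(k):
        out *= x + i
    return out

# scipy.special.poch broadcasts over numpy arrays, so a whole batch of
# (x, k) pairs is evaluated in one call instead of a Python-level loop.
x = np.arange(1.0, 6.0)                                  # x = 1..5
vectorized = poch(x, 3.0)                                # array([6., 24., 60., 120., 210.])
looped = np.array([pochhammer_scalar(xi, 3) for xi in x])
assert np.allclose(vectorized, looped)
```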
My sense is that `lpmv`, `get_spherical_harmonics_element` and `get_spherical_harmonics` could all be wrapped in a single function (at the cost of reusability / extensibility... so maybe the reverse loop order plus caching is enough).