lucidrains / neural-plexer-pytorch

Implementation of Nvidia's NeuralPlexer, for end-to-end differentiable design of functional small-molecules and ligand-binding proteins, in Pytorch
MIT License

What are your thoughts on the pretraining routines? #1

Open · amorehead opened this issue 1 year ago

amorehead commented 1 year ago

Hi, @lucidrains.

Having read the NeuralPlexer paper not too long ago, one of the things that stood out to me is how extensive the authors' pretraining routine for this model is. Do you have any ideas for how one might replicate this pretraining scheme, preferably by reusing existing code repositories? For concreteness, I've sketched below the rough shape of a staged setup I have in mind.
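
To be clear, this is not the authors' actual recipe (which isn't released here), just a minimal sketch of what a staged pretraining harness could look like in PyTorch: pretrain a cheaper sub-module on a larger corpus first, then fine-tune the full model on complex structures while reusing those weights. Every module, dataset, and hyperparameter name below is a placeholder I made up for illustration.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-in for whatever ligand / sequence encoder the real model uses
class LigandEncoder(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(32, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return self.net(x)

# Hypothetical full model that reuses the pretrained encoder for stage 2
class ComplexModel(nn.Module):
    def __init__(self, ligand_encoder, dim=64):
        super().__init__()
        self.ligand_encoder = ligand_encoder
        self.head = nn.Linear(dim, 3)  # e.g. per-token coordinate prediction

    def forward(self, x):
        return self.head(self.ligand_encoder(x))

def run_stage(model, loader, loss_fn, epochs, lr):
    # One generic optimization stage; the real routine would swap in the
    # appropriate losses, schedulers, and checkpointing per stage
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:
            loss = loss_fn(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()

# Random tensors standing in for the large pretraining corpus and the
# smaller complex-structure dataset
ligand_data = TensorDataset(torch.randn(256, 32), torch.randn(256, 64))
complex_data = TensorDataset(torch.randn(256, 32), torch.randn(256, 3))

ligand_encoder = LigandEncoder()

# Stage 1: pretrain the encoder alone on the larger, cheaper dataset
run_stage(ligand_encoder, DataLoader(ligand_data, batch_size=32),
          nn.MSELoss(), epochs=1, lr=1e-3)

# Stage 2: fine-tune the full model, initialized from the pretrained encoder
model = ComplexModel(ligand_encoder)
run_stage(model, DataLoader(complex_data, batch_size=32),
          nn.MSELoss(), epochs=1, lr=1e-4)
```

The part I'm really asking about is what goes into each stage (datasets, objectives, curriculum), and whether any existing repositories already implement pieces of it that could be reused.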