Closed zhenyuhe00 closed 1 year ago
I'd expect only a very minor performance drop. If you use fixed weights to make the weighted sum you can avoid additional memory consumption by just summing into a result tensor throughout the network.
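The "summing into a result tensor" idea can be sketched as follows (NumPy, with hypothetical layer count, shapes, and uniform fixed weights as assumptions; the actual ESM-2 weights and dimensions may differ):

```python
import numpy as np

# Hypothetical setup: a 6-layer network whose layers each emit an
# embedding of shape (seq_len, dim), combined with fixed weights.
num_layers, seq_len, dim = 6, 10, 8
weights = np.full(num_layers, 1.0 / num_layers)  # assumed uniform fixed weights

rng = np.random.default_rng(0)
layer_outputs = [rng.standard_normal((seq_len, dim)) for _ in range(num_layers)]

# Memory-heavy approach: keep every layer's output, then weighted-sum.
stacked = np.stack(layer_outputs)                   # (num_layers, seq_len, dim)
reference = np.tensordot(weights, stacked, axes=1)  # (seq_len, dim)

# Memory-light approach: accumulate into one result tensor as each
# layer's output becomes available, so only one extra (seq_len, dim)
# buffer is ever held, not all num_layers of them.
result = np.zeros((seq_len, dim))
for w, h in zip(weights, layer_outputs):
    result += w * h

assert np.allclose(result, reference)
```

Because the weights are fixed, the running accumulation gives exactly the same tensor as stacking all layers first, which is why the extra memory cost can be avoided.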
Thanks!
I wonder how the weights are obtained: are they trained when training ESMFold, or just manually set to fixed values? Thanks in advance!
Will be released with ESMFold.
Excited to share that ESMFold was released on November 1st!
Could you tell us the key under which the layer weights are stored in `esmfold_3B_v1.pt`? Thank you!
Hi, congrats on this series of great work! I'm using your pre-trained model for downstream applications. In the ESM-2 paper, a weighted sum of the embeddings from all layers of ESM-2 is fed to ESMFold. I wonder if it's enough to use only the embeddings from the last layer of ESM-2, since storing the embeddings from all layers is expensive. Would the performance drop?
Thanks in advance!