facebookresearch / esm

Evolutionary Scale Modeling (esm): Pretrained language models for proteins
MIT License

About downstream tasks #136

Closed: Yijia-Xiao closed this issue 2 years ago

Yijia-Xiao commented 2 years ago

Hi, I have two questions regarding the details of the downstream tasks (section 4.2, Supervised Contact Prediction, and section 4.3, Secondary Structure Prediction) in the MSA Transformer paper.

For supervised contact prediction, the paper says "we train a deep residual network". I'm wondering whether the MSA Transformer model is finetuned while training the downstream deep residual network.

For secondary structure prediction, the paper says "we train a state-of-the-art downstream head based on the Netsurf architecture (Klausen et al., 2019). The downstream model is trained to predict 8-class secondary structure from the pretrained representations." I then referred to the NetSurfP paper, which gives the architecture description shown below the split line.

I am wondering how the MSA Transformer's representations are incorporated into the downstream head based on the Netsurf architecture, and whether HMM profiles are used for SS8 prediction. I would appreciate it if you could provide more details about the architecture used for SS8.

Thank you!

-- split line --

[image: NetSurfP architecture description from Klausen et al., 2019]

tomsercu commented 2 years ago

Thanks for getting in touch!

> whether the MSA Transformer model is finetuned while training the downstream deep residual network

No, we don't finetune the MSA Transformer here; the model is kept frozen.
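
For reference, a minimal sketch (my own illustration, not code from this thread) of keeping the pretrained model frozen so that only the downstream head receives gradient updates:

```python
# Freeze the pretrained MSA Transformer so only the downstream head trains.
# `model` is assumed to be a loaded ESM MSA Transformer (see sketch further below).
for param in model.parameters():
    param.requires_grad = False
model.eval()  # also disables dropout while extracting features
```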

> how the MSA Transformer's representations are incorporated into the downstream head based on the Netsurf architecture

We extract the sequence representation corresponding to the main sequence (the query sequence of the MSA) only, so if emb is [M x L x d] we take emb[0]. Another logical alternative, which performed similarly but a little worse, is to average over the MSA, i.e. use emb.mean(0). No feature combination with the HMM features is done here.
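
Here is a minimal sketch of that extraction with the public esm API; the model checkpoint (esm_msa1b_t12_100M_UR50S) and the toy MSA below are my assumptions for illustration, not specifics from this thread:

```python
import torch
import esm

# Load a pretrained MSA Transformer checkpoint and its alphabet.
model, alphabet = esm.pretrained.esm_msa1b_t12_100M_UR50S()
model.eval()
batch_converter = alphabet.get_batch_converter()

# One MSA: the first entry is the query sequence, the rest are aligned homologs.
msa = [
    ("query", "MKTAYIAKQR"),
    ("hom_1", "MKTAYIAKQR"),
    ("hom_2", "MKTAHIAKQR"),
]
_, _, tokens = batch_converter([msa])  # tokens: [1, M, L+1] (leading BOS token)

with torch.no_grad():
    out = model(tokens, repr_layers=[12])  # final (12th) layer representations
emb = out["representations"][12][0]  # [M, L+1, d] for this MSA

query_repr = emb[0, 1:]          # query-sequence features, [L, d]; feed these to the head
msa_mean_repr = emb.mean(0)[1:]  # the averaging alternative mentioned above
```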

Yijia-Xiao commented 2 years ago

Got it! I will try to figure out how to use the MSA Transformer to do the tasks mentioned above. Thank you for your timely reply 👍

tomsercu commented 2 years ago

you're welcome! Let me close the issue for now but please reopen if you face any issues.

Yijia-Xiao commented 2 years ago

Sure! Thank you!