songlab-cal / tape-neurips2019

Tasks Assessing Protein Embeddings (TAPE), a set of five biologically relevant semi-supervised learning tasks spread across different domains of protein biology. (DEPRECATED)
https://arxiv.org/abs/1906.08230
MIT License

Freeze Weights #22

Open spark157 opened 4 years ago

spark157 commented 4 years ago

Hello,

I can see from the Training Details in the paper that during supervised fine-tuning, backpropagation went through the entire model, including the language model portion. I also see from the code that you had some functionality for freezing weights. If you tried it, I was curious what magnitude of difference you saw between freezing and training the language model portion during supervised fine-tuning, especially for the Transformer.

Thanks again!

Scott

rmrao commented 4 years ago

We did not test this thoroughly for every downstream task, but for secondary structure we generally saw 1-2 percentage points of improvement when fine-tuning the whole model. I suspect the difference will depend a great deal on the task.
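For anyone wanting to try this comparison themselves, here is a minimal PyTorch-style sketch of freezing the language-model portion while training only the task head. The module names and sizes are illustrative stand-ins, not the repository's actual code, which may be structured quite differently.

```python
import torch
from torch import nn

# Hypothetical stand-ins: a pretrained "language model" encoder and a small
# supervised task head (names and dimensions are illustrative only).
encoder = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, 128))
head = nn.Linear(128, 3)  # e.g. a 3-class per-position prediction head

# Freeze the language-model portion so fine-tuning only updates the head.
for p in encoder.parameters():
    p.requires_grad = False

# Give the optimizer only the parameters that should still train.
optimizer = torch.optim.Adam(
    [p for p in head.parameters() if p.requires_grad], lr=1e-3
)

# One fine-tuning step: encoder weights stay fixed, head weights update.
x, y = torch.randn(8, 128), torch.randint(0, 3, (8,))
loss = nn.functional.cross_entropy(head(encoder(x)), y)
loss.backward()
optimizer.step()
```

To fine-tune the whole model instead (the setting used in the paper), skip the freezing loop and pass all parameters to the optimizer.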
