Amazing work!
However, I wonder whether the pre-trained model can be used for downstream tasks like sequence classification. I'm currently trying to harness the ESM2 model for protein annotation and have found it hard to fine-tune. The examples I can find only use full-parameter fine-tuning, whereas I'd like to use LoRA/QLoRA or freeze part of the layers instead. Where can I find examples of these parameter-efficient fine-tuning methods? For instance, can I use plain PyTorch to freeze part of the layers in the ESM2 model and train only the remaining parameters, something like the sketch below?
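This is roughly what I have in mind for the freezing approach: a minimal sketch assuming the Hugging Face `transformers` port of ESM2 (the checkpoint name `facebook/esm2_t12_35M_UR50D` and the `esm.embeddings` / `esm.encoder.layer` attribute paths are my guesses from reading `modeling_esm.py`, so please correct me if they're wrong):

```python
import torch
from transformers import AutoTokenizer, EsmForSequenceClassification

# Assumption: using the Hugging Face ESM2 port; attribute names below come
# from transformers' modeling_esm.py and may differ in other releases.
model_name = "facebook/esm2_t12_35M_UR50D"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = EsmForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Freeze the embeddings and all but the last two encoder layers,
# so only the top layers and the classification head stay trainable.
for param in model.esm.embeddings.parameters():
    param.requires_grad = False
for layer in model.esm.encoder.layer[:-2]:
    for param in layer.parameters():
        param.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")
```

And for LoRA, is wrapping the same model with `peft` the recommended route? A sketch of what I mean (the `target_modules=["query", "value"]` names are an assumption based on ESM's self-attention layer names in `transformers`):

```python
from peft import LoraConfig, TaskType, get_peft_model

# Assumption: ESM attention projections are named "query"/"key"/"value",
# so those are used as the LoRA target modules here.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],
)
peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()
```

Does either of these look like a sensible way to fine-tune ESM2 efficiently, or is there an official example I should follow instead?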
Any help is welcome! Thanks.