westlake-repl / SaProt

[ICLR'24 spotlight] Saprot: Protein Language Model with Structural Alphabet
MIT License

Finetuning GPU memory cost #26

Closed byte233 closed 2 months ago

byte233 commented 2 months ago

Hi Sir,

How much GPU memory does finetuning the 650M SaProt model require? I get an OOM error when I run the finetuning script with the command the README file provides. My GPU has 24 GB of memory.

LTEnjoy commented 2 months ago

Hi, thank you for your interest in our work!

We are sorry, but it seems 24 GB of memory may not be enough to fine-tune the 650M model, even with the batch size set to 1. We recommend setting freeze_backbone=True to train only the classification head, or using the 35M SaProt model as an alternative.
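For reference, freezing the backbone amounts to disabling gradients on the backbone parameters and passing only the remaining (head) parameters to the optimizer. Here is a minimal PyTorch sketch of that pattern; `ToyProteinClassifier` is a hypothetical stand-in, not the actual SaProt model class:

```python
import torch
from torch import nn

# Hypothetical stand-in for SaProt: a small backbone plus a classification head.
class ToyProteinClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Embedding(30, 64), nn.Linear(64, 64))
        self.classification_head = nn.Linear(64, 2)

    def forward(self, x):
        return self.classification_head(self.backbone(x))

model = ToyProteinClassifier()

# Equivalent of freeze_backbone=True: stop gradient updates to the backbone.
for p in model.backbone.parameters():
    p.requires_grad = False

# Give the optimizer only the trainable (head) parameters, so no optimizer
# state is allocated for the frozen backbone weights.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)  # only the classification head's parameters remain trainable
```

This saves memory twice over: frozen weights need no gradient buffers, and the optimizer keeps no momentum/variance state for them.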

Hope this could resolve your problem :)

byte233 commented 2 months ago

Thanks!