lucidrains / enformer-pytorch

Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
MIT License
434 stars 81 forks

Problems during fine-tuning on my own ATAC-seq data #37

Open Aut-eve opened 9 months ago

Aut-eve commented 9 months ago

Hi, I'm now trying to fine-tune the pretrained model on my own ATAC-seq data using HeadAdapterWrapper. Due to device limitations and long training times, I set finetune_last_n_layers_only to 2 and use Adam as the optimizer, with CosineAnnealingLR as the learning rate scheduler, annealing from 1e-3 to 1e-5. However, the results are not good, so could you share your learning rate, optimizer, and other hyperparameters? Thanks very much.
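For reference, the schedule described above can be sketched without any framework code. This is a minimal sketch of the cosine annealing formula that torch.optim.lr_scheduler.CosineAnnealingLR implements, assuming a hypothetical horizon of T_max steps (the actual training length, batch size, and the HeadAdapterWrapper/optimizer wiring from the question are not shown here and would need to match your own setup):

```python
import math

# Cosine annealing from eta_max down to eta_min over T_max steps, as in
# torch.optim.lr_scheduler.CosineAnnealingLR:
#   lr(t) = eta_min + 0.5 * (eta_max - eta_min) * (1 + cos(pi * t / T_max))
def cosine_annealing_lr(t, T_max, eta_max=1e-3, eta_min=1e-5):
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_max))

# T_max = 100 is an arbitrary illustration, not a recommended value
T_max = 100
lrs = [cosine_annealing_lr(t, T_max) for t in range(T_max + 1)]

print(lrs[0])    # starts at eta_max (1e-3)
print(lrs[-1])   # ends at eta_min (1e-5)
```

One thing worth checking with this schedule: T_max should cover the full planned number of steps (or epochs, depending on where scheduler.step() is called), otherwise the learning rate starts rising again past T_max, which can destabilize fine-tuning.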