microsoft / Graphormer

Graphormer is a general-purpose deep learning backbone for molecular modeling.
MIT License

About the results on ogbp-pcba during finetuning #97

Open zhangdan0602 opened 2 years ago

zhangdan0602 commented 2 years ago

Hi, thanks for your exciting Graphormer. I have a question about using Graphormer v2 to pretrain on PCQM4Mv2 and then finetune on ogbp-pcba or hiv. The AP metric evaluated on pcba during finetuning is '2022-03-13 12:44:53 | INFO | main | ap: 0.02584909547397629', but the leaderboard result is about 0.3. Similarly, the AUC on hiv during finetuning is about 0.3 ('| INFO | main | auc: 0.32'), while it is 0.8 on OGB. I would like to understand the reason. Perhaps the hyper-parameters do not match, or max-epoch is too small? Thank you very much.
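For context (an observation, not from the thread itself): an AP as low as ~0.026 on a sparsely labeled dataset is roughly what an untrained or random predictor yields, because the expected average precision of uninformative scores equals the fraction of positive labels. A minimal pure-Python sketch illustrates this; the 1.4% prevalence below is an illustrative assumption, not the exact pcba statistic:

```python
import random

def average_precision(y_true, scores):
    """AP: mean of precision@k taken at the rank of each positive example."""
    order = sorted(range(len(y_true)), key=lambda i: scores[i], reverse=True)
    hits, ap = 0, 0.0
    for rank, i in enumerate(order, start=1):
        if y_true[i]:
            hits += 1
            ap += hits / rank
    return ap / max(hits, 1)

random.seed(0)
n = 20000
prevalence = 0.014  # illustrative sparsity, not the exact pcba figure
y = [1 if random.random() < prevalence else 0 for _ in range(n)]
s = [random.random() for _ in range(n)]  # uninformative (random) scores
print(average_precision(y, s))  # lands near the positive prevalence
```

So an AP near the label prevalence, rather than near the leaderboard's ~0.3, is consistent with a model that has not actually learned the finetuning task (e.g. a checkpoint or hyper-parameter mismatch).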

zhengsx commented 2 years ago

Thanks for using Graphormer.

For PCBA, if this feature is urgent for you, please add a thumbs-up reaction to https://github.com/microsoft/Graphormer/issues/70, and we will raise its priority.

For Hiv, could you provide your Python environment and the exact instructions you ran to obtain your result? With the correct instructions, the AUC on Hiv should be at least 0.8; see #90.

zhangdan0602 commented 2 years ago

Thank you. Last week I tried pretraining and finetuning with Graphormer v2.0. This week I used v1.0 to pretrain and finetune on ogbp-molhiv and pcba, and obtained the expected results.

LUOyk1999 commented 1 year ago

Hi, thanks for your work. For PCBA, the only pretrained checkpoint I can obtain is 'pcqm4mv1_graphormer_base_for_molhiv'. However, its num-classes is 1, while PCBA has 128 classes. Can this pretrained model be used to train on PCBA, and if so, how? Thanks very much.
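One common way to handle this kind of head mismatch (a general PyTorch-style pattern, not Graphormer's documented procedure) is to copy only the pretrained weights whose names and shapes match the target model, so the shared encoder transfers while the 1-class output head is dropped and the new 128-class head trains from scratch. A minimal sketch with hypothetical parameter names and shapes:

```python
def transferable_weights(pretrained_shapes, target_shapes):
    """Keep only entries whose name exists in the target model with the same shape.

    Both arguments map parameter names to shape tuples, mimicking the
    structure of a state_dict. Mismatched entries (e.g. a 1-class vs
    128-class output head) are silently dropped.
    """
    return {name: shape for name, shape in pretrained_shapes.items()
            if target_shapes.get(name) == shape}

# Hypothetical names/shapes: the encoder matches, the classifier head does not.
pretrained = {
    "encoder.layer0.weight": (768, 768),
    "classifier.weight": (1, 768),    # molhiv head: 1 class
}
target = {
    "encoder.layer0.weight": (768, 768),
    "classifier.weight": (128, 768),  # pcba head: 128 classes
}
kept = transferable_weights(pretrained, target)
print(sorted(kept))  # only the encoder weight survives
```

With real torch tensors, the filtered dict would then be loaded via `model.load_state_dict(kept, strict=False)`, which permits missing keys such as the new head.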