yuyangw / MolCLR

Implementation of MolCLR: "Molecular Contrastive Learning of Representations via Graph Neural Networks" in PyG.

Some questions about fine-tuning #23

Open Shimmer8001 opened 1 year ago

Shimmer8001 commented 1 year ago

Recently I came across some papers on molecular contrastive learning, and it was my great pleasure to find your team's paper, Molecular Contrastive Learning of Representations via Graph Neural Networks. It has benefited me a lot. However, when I use the pre-trained model you provided for downstream tasks with the default configuration file config_finetune.yaml, the model's performance never reaches the numbers reported in the paper. Could you provide the hyperparameter configuration files used for the downstream tasks on each dataset?
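For anyone sweeping hyperparameters while waiting for the official per-dataset settings, here is a minimal sketch of overriding config_finetune.yaml per task before launching the fine-tuning script. The key names (task_name, init_lr, batch_size) are assumed to match the repo's config file, and the override values are placeholders for a sweep, not the hyperparameters used in the paper.

```python
import yaml

# Load the repo's default fine-tuning config (path assumed to be the repo root).
with open("config_finetune.yaml") as f:
    config = yaml.safe_load(f)

# Hypothetical per-dataset overrides -- placeholder values to sweep over,
# NOT the settings used in the paper.
overrides = {
    "BBBP":    {"init_lr": 5e-4, "batch_size": 32},
    "BACE":    {"init_lr": 1e-3, "batch_size": 64},
    "ClinTox": {"init_lr": 5e-4, "batch_size": 32},
}

task = "BBBP"
config["task_name"] = task
config.update(overrides[task])

# Write the modified config back so the fine-tuning script picks it up.
with open("config_finetune.yaml", "w") as f:
    yaml.safe_dump(config, f)
```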

danielkaifeng commented 1 year ago

Hi @Shimmer8001, I fine-tuned and found results similar to the paper, with only a minor decrease. I think some random seeds need to be set to replicate exactly the same results.
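As a reference, here is a minimal sketch of fixing the seeds before fine-tuning, assuming the training script does not already do so; the seed value 42 is arbitrary:

```python
import random
import numpy as np
import torch

def set_seed(seed: int = 42) -> None:
    """Fix all common sources of randomness for a (mostly) reproducible run."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Deterministic cuDNN kernels trade speed for reproducibility.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(42)  # call once before building the data loaders and the model
```

Even with fixed seeds, scaffold splitting and GPU non-determinism can still cause small run-to-run differences.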

Besides, did you try pre-training the model on a larger or different dataset to improve the fine-tuning results?

zhangtia16 commented 6 months ago

Same here; I find that I cannot reproduce the results reported in the paper on many of the datasets.

happyCoderZC commented 1 month ago

Hi @danielkaifeng, I can't reproduce the results shown in the paper either. I guess the hyperparameters and random seed I set are not suitable. Could you share the hyperparameters and random seed you used? Thank you very much.