Shen-Lab / GraphCL

[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
MIT License

Results about semisupervised_TU PROTEINS experiments #14

Closed. OrdinaryCrazy closed this issue 3 years ago.

OrdinaryCrazy commented 3 years ago

I followed the instructions in the README, but I cannot reach the 0.7417 accuracy reported in the paper on the PROTEINS dataset; I only get about 0.73.

yyou1996 commented 3 years ago

Hi @OrdinaryCrazy,

Thanks for your interest in our work. (1) Did you perform the hyperparameter tuning as suggested? If not, this might help (see the instructions below, which follow the README). (2) Did you try other datasets and compare the performance?

lr in pre-training should be tuned over {0.01, 0.001, 0.0001}, and model_epoch in fine-tuning (i.e., which epoch checkpoint of the pre-trained model is loaded) over {20, 40, 60, 80, 100}.
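
For reference, a minimal sweep sketch in Python over exactly that grid. The script names `pretrain.py` / `finetune.py` and every flag except the lr and model_epoch values above are hypothetical placeholders, not the repo's actual entry points; substitute the commands from the semisupervised_TU README.

```python
import subprocess

# Grid suggested above: pre-training lr and the fine-tuning checkpoint epoch.
lrs = [0.01, 0.001, 0.0001]
model_epochs = [20, 40, 60, 80, 100]

for lr in lrs:
    # Hypothetical pre-training entry point; replace with the repo's actual script/flags.
    subprocess.run(
        ["python", "pretrain.py", "--dataset", "PROTEINS", "--lr", str(lr)],
        check=True,
    )
    for epoch in model_epochs:
        # Hypothetical fine-tuning entry point that loads the epoch-`epoch` checkpoint
        # produced by the pre-training run above.
        subprocess.run(
            ["python", "finetune.py", "--dataset", "PROTEINS",
             "--pretrain_lr", str(lr), "--model_epoch", str(epoch)],
            check=True,
        )
```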

OrdinaryCrazy commented 3 years ago

Thank you. Setting the pre-training lr to 0.01 brings the mean accuracy on PROTEINS close to 0.74, although there is a lot of variance across pre-trained models.

ha-lins commented 3 years ago

Hi @OrdinaryCrazy @yyou1996 ,

Could you please tell me how to run the semi-supervised setting with a label rate of 10%, i.e., how to set the label rate? Thanks!

yyou1996 commented 3 years ago

Hi @ha-lins,

Please see the instructions here: https://github.com/Shen-Lab/GraphCL/tree/master/semisupervised_TU#graphcl-with-sampled-augmentations. Just set the flag --semi_split to 10 when fine-tuning.
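
In case it helps to see what a 10% label rate looks like concretely, here is a small illustration of my own (not the repo's code), under the assumption that `--semi_split N` acts like a fold count so that the label rate is roughly 1/N:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Assumption: --semi_split N partitions the training labels into N stratified
# folds and keeps one fold as the labeled set, i.e. label rate ~= 1 / N.
semi_split = 10                          # --semi_split 10  ->  ~10% labeled
y = np.random.randint(0, 2, size=1113)   # PROTEINS has 1113 graphs (dummy labels here)

skf = StratifiedKFold(n_splits=semi_split, shuffle=True, random_state=0)
_, labeled_idx = next(skf.split(np.zeros((len(y), 1)), y))
print(f"labeled graphs: {len(labeled_idx)} / {len(y)} "
      f"(~{len(labeled_idx) / len(y):.0%})")
```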