Shen-Lab / GraphCL

[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
MIT License

semisupervised learning dataset #48

Closed Bunnyqiqi closed 2 years ago

Bunnyqiqi commented 2 years ago

Hi! I have several questions about semi-supervised learning.

Taking the MUTAG dataset as an example:

(1) For the semi-supervised results reported in the paper, which dataset did you use: MUTAG deg+odeg100+ak3+recall or deg+odeg100+ak3+reall? What do these dataset names mean, and what is the actual difference between the two?

(2) Why not use the same datasets as in the unsupervised experiments? What are the differences between the datasets used in semi-supervised and unsupervised learning? Can I directly substitute the unsupervised datasets into the semi-supervised experiments and get the same results?

(3) In semi-supervised learning, I obtain several pretrained weight files after pretraining, yet finetuning still takes a very long time. That seems strange, since finetuning should be quick. Where should I modify the code to make finetuning faster?

(4) In semi-supervised learning, which model did you pretrain to get the results reported in the paper, GFN or ResGCN? Why not use GIN, as in the unsupervised experiments?

Thanks a lot!! I'm very puzzled!

yyou1996 commented 2 years ago

Hi @Bunnyqiqi,

Thank you for the questions.

(1) The code will take care of the data processing. We follow the backbone pipeline in https://github.com/chentingpc/gfn.

(2) We build the model on top of the corresponding SOTA settings; I believe different versions of the TU datasets differ only minorly.

(3) Yes, you are definitely welcome to optimize the code for a faster run.

(4) Please refer to (2).
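Regarding (3), one generic way to cut finetuning cost is to freeze the pretrained encoder and train only the classification head (a linear-probe style run). The sketch below is a hypothetical PyTorch illustration, not this repository's actual training code: the names (GraphClassifier, pretrain.pt) are made up, the encoder is a stand-in MLP rather than a real GNN, and the paper's reported numbers may come from full end-to-end finetuning rather than a frozen encoder.

```python
# Hypothetical sketch: load pretrained weights, freeze the encoder, and
# finetune only the classification head so each epoch is much cheaper.
import torch
import torch.nn as nn

class GraphClassifier(nn.Module):
    def __init__(self, in_dim=32, hid=64, n_classes=2):
        super().__init__()
        # Stand-in for the pretrained graph encoder (e.g. ResGCN in the paper).
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hid), nn.ReLU(), nn.Linear(hid, hid)
        )
        # Classification head, trained from scratch during finetuning.
        self.head = nn.Linear(hid, n_classes)

    def forward(self, x):
        return self.head(self.encoder(x))

model = GraphClassifier()
# model.encoder.load_state_dict(torch.load("pretrain.pt"))  # hypothetical checkpoint

# Freeze the encoder so only the head's parameters receive gradients.
for p in model.encoder.parameters():
    p.requires_grad = False

# Pass only the trainable parameters to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

Whether this matches the reported results is a separate question; it only addresses wall-clock time per finetuning run.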