divelab / DIG

A library for graph deep learning research
https://diveintographs.readthedocs.io/
GNU General Public License v3.0

When running S-Mixup in DIG, we found results different from those in the paper published at ICML, and the difference is large. Why? #230

Closed: ZDLTH closed this issue 6 months ago

ZDLTH commented 7 months ago

For example, on the IMDB-B dataset, the results we obtained are:

image

but the results reported in the paper are:

image

The method does not even beat the basic baselines. I also ran many other datasets and ran into the same or other problems. Could the authors tell us whether any additional parameters need to be tuned to reproduce the results in the paper?

hongyiling commented 6 months ago

What value of alpha are you using? As we show in Table 7, S-Mixup is sensitive to the alpha value.

ZDLTH commented 6 months ago

When we set alpha to 1, we got 0.704 on IMDB-B; when we set alpha to 0.5, we got 0.71.

xuyuankun631 commented 6 months ago

Hey brother, can I add you on WeChat to talk about the details of running S-Mixup?

xuyuankun631 commented 6 months ago

My WeChat ID is 18844094869

hongyiling commented 6 months ago

In my experiments, S-Mixup usually performs better when the alpha value is in the [0.1, 0.3] range. For IMDBB, I use alpha=0.1 and sim_function=abs_diff to get the best results. Please check Appendix B and C in our paper for hyperparameters. Let me know if there are any missing hyperparameters.
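
If it helps to see why alpha matters so much: assuming the mixup ratio is drawn from a Beta(alpha, alpha) distribution as in standard mixup (my assumption here, not a statement about the exact implementation), a small alpha keeps most mixed graphs close to one of the two originals, while alpha = 1 spreads the ratio uniformly. A quick numpy illustration:

```python
# Illustration only: how the shape of Beta(alpha, alpha) changes with alpha.
# Small alpha concentrates the mixup ratio lambda near 0 or 1, so mixed graphs
# stay close to one of the originals; alpha = 1 makes lambda uniform on [0, 1].
import numpy as np

rng = np.random.default_rng(0)
for alpha in (0.1, 0.3, 1.0):
    lam = rng.beta(alpha, alpha, size=100_000)
    # fraction of draws that land near the endpoints (lambda < 0.1 or > 0.9)
    near_ends = np.mean((lam < 0.1) | (lam > 0.9))
    print(f"alpha={alpha}: P(lambda near 0 or 1) ~= {near_ends:.2f}")
```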

Please also make sure GMNET is well-trained, otherwise the results will be bad. If you don't want to train GMNET, you can try to use RRWM (https://pygmtools.readthedocs.io/en/latest/) to replace the computation of the soft alignments (https://github.com/divelab/DIG/blob/dig-stable/dig/auggraph/method/SMixup/smixup.py#L273).
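
For reference, here is a rough sketch of what that RRWM replacement could look like with pygmtools. This is only an illustration, not the repo's code: it assumes dense adjacency matrices as input and Gaussian edge affinities, and the `to_batched_edges` helper is hypothetical.

```python
# Sketch only: computing a soft alignment between two graphs with RRWM from
# pygmtools, as a possible drop-in for the learned GMNET alignment. Inputs are
# assumed to be dense adjacency matrices A1 (n1 x n1) and A2 (n2 x n2).
import functools
import numpy as np
import pygmtools as pygm

pygm.set_backend('numpy')  # older pygmtools releases use `pygm.BACKEND = 'numpy'`

def to_batched_edges(A):
    # hypothetical helper: dense (n, n) adjacency -> edge list (1, ne, 2) and weights (1, ne, 1)
    r, c = np.nonzero(A)
    return np.stack([r, c], axis=-1)[None], A[r, c][None, :, None]

def rrwm_soft_alignment(A1, A2):
    """Return an (n1, n2) soft alignment matrix between two graphs."""
    n1, n2 = np.array([A1.shape[0]]), np.array([A2.shape[0]])
    conn1, edge1 = to_batched_edges(A1)
    conn2, edge2 = to_batched_edges(A2)
    aff_fn = functools.partial(pygm.utils.gaussian_aff_fn, sigma=1.0)
    # affinity matrix built from edge weights only (no node features in this sketch)
    K = pygm.utils.build_aff_mat(None, edge1, conn1, None, edge2, conn2,
                                 n1, None, n2, None, edge_aff_fn=aff_fn)
    S = pygm.rrwm(K, n1, n2)    # relaxed matching scores, shape (1, n1, n2)
    return pygm.sinkhorn(S)[0]  # soft alignment (use pygm.hungarian for a hard matching)
```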

Since the ogbg-molhiv dataset has an official split and smaller variance, I have also shared my checkpoints of GMNET and the GCN classifier at https://github.com/divelab/DIG_storage/tree/main/auggraph/SMixup. I hope this helps.

ZDLTH commented 6 months ago

Thank you for your reply. My previous results were obtained using the optimal parameters provided in the appendix. Following your suggestion, I ran the IMDB-B dataset with alpha=0.1 and sim_function=abs_diff, and the result was 0.7130.

hongyiling commented 6 months ago

I just ran GIN on IMDB-B three times using the same hyperparameters and got 0.737, 0.719, and 0.729. It seems that our method is not stable on this dataset. Thanks for bringing this up; I had not noticed this problem before. I think there are two potential reasons. One is that the dataset is small, so the training of GMNET is not stable, and bad alignments computed by GMNET lead to changes in the results. The other could be the limitation we discussed in the paper, namely that the transformation of the graph makes the mixed label imperfect; when the dataset is small, imperfect labels for the mixed data can cause unstable results.
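
For what it's worth, the three runs above average out to roughly 0.728 with a standard deviation of about 0.007, so reporting mean and std over several seeds (including the GMNET training) would make the comparison to the paper's numbers more meaningful. A trivial check:

```python
# Quick sanity check on the three runs reported above (illustration only).
import numpy as np

accs = np.array([0.737, 0.719, 0.729])
print(f"IMDB-B over 3 seeds: {accs.mean():.3f} +/- {accs.std():.3f}")
# -> IMDB-B over 3 seeds: 0.728 +/- 0.007
```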