qitianwu / DIFFormer

The official implementation for ICLR23 spotlight paper "DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion"

Cora datasets #11

Closed SeongJinAhn closed 1 year ago

SeongJinAhn commented 1 year ago

I ran your code for node classification on the Cora dataset, but the performance is much lower than expected. Are there any hyperparameters that need to be changed in the provided code?

These are the results I got when running "python main.py":

Run 05:
Highest Train: 14.29
Highest Valid: 31.60
Highest Test: 31.90
Chosen epoch: 498
Final Train: 14.29
Final Test: 31.90

All runs:
Highest Train: 14.29 ± 0.00
Highest Test: 31.90 ± 0.00
Highest Valid: 31.60 ± 0.00
Final Train: 14.29 ± 0.00
Final Test: 31.90 ± 0.00

qitianwu commented 1 year ago

These scores are unexpected. If you install the packages listed in requirement.txt and use the command given in run.sh to train on the Cora dataset, you should reproduce the reported results; there are no other hyperparameters that need to be changed. Please let me know if you still have problems with this.
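For reference, a minimal reproduction sketch of the steps described above. The exact command-line flags for Cora are not shown in this thread, so the placeholder arguments below are assumptions; the authoritative command is the one in the repository's run.sh.

```bash
# Install the pinned dependencies (file name as used in the repository)
pip install -r requirement.txt

# Train on Cora using the command provided by the authors.
# Either run the script directly:
bash run.sh

# ...or copy the Cora line out of run.sh and run it yourself, e.g. something like
# (flag names here are illustrative placeholders, not the confirmed interface):
# python main.py --dataset cora --method difformer
```

Running the plain `python main.py` without the arguments from run.sh can fall back to default hyperparameters that differ from the paper's Cora setting, which would explain degraded scores.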