
How powerful are K-hop message passing graph neural networks

This repository is the official implementation of the models described in the paper How Powerful are K-hop Message Passing Graph Neural Networks (NeurIPS 2022).

News

In version 4.0, we:

Simulation datasets for validating expressive power

Simulation on regular graphs:

# node level
python run_simulation.py --n 20 40 80 160 320 640 1280 --save_appendix=_node --N=10
# graph level
python run_simulation.py --n 20 40 80 160 320 640 1280 --save_appendix=_graph --N=100 --graph
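
The simulation tests whether the model can distinguish regular graphs of the listed sizes at the node and graph level. Below is a minimal, illustrative sketch of sampling such graph pairs with networkx; the degree value, seeds, and loop are assumptions for illustration, not the exact procedure in run_simulation.py.

# Sample pairs of random regular graphs of size n. All nodes in a regular
# graph receive the same 1-WL color, so such pairs are a natural test of
# expressive power beyond 1-WL. Degree and seeds here are illustrative
# assumptions; run_simulation.py defines the actual procedure.
import networkx as nx

def sample_regular_graph_pair(n, degree=3, seed=0):
    # Two independently sampled random regular graphs of the same size are
    # almost surely non-isomorphic, yet 1-WL cannot tell them apart.
    g1 = nx.random_regular_graph(degree, n, seed=seed)
    g2 = nx.random_regular_graph(degree, n, seed=seed + 1)
    return g1, g2

for n in [20, 40, 80, 160, 320, 640, 1280]:
    g1, g2 = sample_regular_graph_pair(n)
    print(n, g1.number_of_nodes(), g2.number_of_nodes())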

EXP dataset:

# run single model
python train_EXP.py
# search for different K and model
python run_EXP_search.py
# multi-gpu
python run_EXP_search.py --parallel

SR25 dataset:

# run single model
python train_SR.py
# search for different K and model
python run_SR_search.py
# multi-gpu
python run_SR_search.py --parallel

CSL dataset:

# run single model
python train_CSL.py
# search for different K and model
python run_CSL_search.py
# multi-gpu
python run_CSL_search.py --parallel
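
The CSL benchmark consists of Circular Skip Link graphs: 4-regular graphs that 1-WL cannot distinguish. A minimal construction sketch with networkx is shown below; it only illustrates the benchmark, while train_CSL.py loads the standard CSL dataset itself.

# A Circular Skip Link graph CSL(n, k) is a cycle on n nodes plus "skip"
# edges joining each node i to node (i + k) mod n. Two CSL graphs with the
# same n but different skip lengths are 1-WL equivalent but non-isomorphic.
import networkx as nx

def csl_graph(n, k):
    g = nx.cycle_graph(n)
    g.add_edges_from((i, (i + k) % n) for i in range(n))
    return g

g1, g2 = csl_graph(41, 2), csl_graph(41, 3)
# Both graphs are 4-regular, yet they differ structurally: CSL(41, 2)
# contains triangles while CSL(41, 3) contains none.
print(sum(nx.triangles(g1).values()) // 3, sum(nx.triangles(g2).values()) // 3)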

Simulation datasets for node/graph properties and substructure counting

Node/graph properties:

# single task
python train_graph_property.py --task=0
python train_node_property.py --task=0
# run all tasks with a search
python run_graph_node_property.py
# multi-gpu
python run_graph_node_property.py --parallel
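
The node/graph property tasks are regression problems whose targets depend on more than local structure. As a rough illustration only (the exact task list and --task indices are defined in the training scripts), targets of this flavor can be computed with networkx:

# Illustrative targets of the kind used in the property-prediction tasks,
# e.g. node eccentricity (node level) and graph diameter (graph level).
# This is not the repo's dataset code, just an example of the label types.
import networkx as nx

g = nx.random_geometric_graph(50, radius=0.3, seed=0)
if nx.is_connected(g):
    node_target = nx.eccentricity(g)   # dict: node -> eccentricity
    graph_target = nx.diameter(g)      # single scalar per graph
    print(graph_target, max(node_target.values()))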

Substructure counting:

# single task
python train_structure_counting.py --task=0
# run all tasks with a search
python run_structure_counting.py
# multi-gpu
python run_structure_counting.py --parallel
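
Each substructure-counting task regresses, per node, the number of occurrences of a small substructure (such as a triangle or short cycle) that the node takes part in. A hedged example of computing one such ground-truth count with networkx follows; the actual targets and the mapping from --task indices to substructures live in the repo's dataset code.

# Example ground truth for a triangle-counting task: how many triangles each
# node participates in. Only an illustration of the label type, not the
# repo's preprocessing.
import networkx as nx

g = nx.erdos_renyi_graph(30, p=0.2, seed=0)
per_node = nx.triangles(g)                        # dict: node -> triangle count
print(per_node[0], sum(per_node.values()) // 3)   # node 0's count, total count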

Real-world datasets

Run MUTAG dataset with 3-hop KP-GCN:

python train_TU.py --dataset_name=MUTAG --model_name=KPGCN --K=3 --kernel=spd
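
Here --K sets the number of hops and --kernel selects how hops are defined. With the shortest-path-distance (spd) kernel, the k-th hop neighbors of a node are the nodes at shortest-path distance exactly k; the graph-diffusion kernel instead uses nodes reachable by a walk of length k. A small networkx sketch of the spd definition is given below for intuition only; the repo computes these neighborhoods in its own preprocessing.

# K-hop neighborhoods under the shortest-path-distance (spd) kernel: the
# k-th hop of node v is the set of nodes at shortest-path distance exactly k.
import networkx as nx

def khop_neighbors_spd(g, v, K):
    dist = nx.single_source_shortest_path_length(g, v, cutoff=K)
    hops = {k: [] for k in range(1, K + 1)}
    for u, d in dist.items():
        if d >= 1:
            hops[d].append(u)
    return hops

g = nx.cycle_graph(8)
print(khop_neighbors_spd(g, 0, K=3))  # {1: [1, 7], 2: [2, 6], 3: [3, 5]}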

Run TU dataset search:

python run_TU_search.py
# multi-gpu
python run_TU_search.py --parallel

Run QM9 targets:

# single target
python train_qm9.py --task=7
# all targets
python run_qm9_search.py
# multi-gpu
python run_qm9_search.py --parallel

Run ZINC dataset:

python train_ZINC.py --residual --K=8 --model_name=KPGINPlus --num_layer=8 --hidden_size=104
python train_ZINC.py --K=16 --num_layer=17 --hidden_size=96 --residual --model_name=KPGINPrime

Reference

If you find the code useful, please cite our paper:

@inproceedings{feng2022how,
title={How Powerful are K-hop Message Passing Graph Neural Networks},
author={Jiarui Feng and Yixin Chen and Fuhai Li and Anindya Sarkar and Muhan Zhang},
booktitle={Advances in Neural Information Processing Systems},
editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
year={2022},
url={https://openreview.net/forum?id=nN3aVRQsxGd}
}