entslscheia / GGNN_Reasoning

PyTorch implementation of the Gated Graph Neural Network (for Knowledge Graphs)

GGNN_Reasoning

This is a PyTorch implementation of the Gated Graph Neural Network (Li, Yujia, et al. "Gated graph sequence neural networks." arXiv preprint arXiv:1511.05493 (2015)). It follows the framework of JamesChuanggg/ggnn.pytorch. The main difference is that this implementation is more memory-efficient, making it better suited to graph datasets with a very large number of edge types, such as Knowledge Graphs. Note that most other implementations you can find are designed for datasets with only a few edge types, such as the bAbI dataset.
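To illustrate the memory-efficiency point, here is a minimal, hypothetical sketch of one GGNN propagation step (all class and variable names are mine, not the repo's API): instead of materializing the dense block adjacency matrix used by bAbI-style implementations, whose size grows with the number of edge types, messages are gathered per edge and transformed by that edge's type-specific matrix, so memory scales with the number of edges rather than with nodes × edge types.

```python
import torch
import torch.nn as nn

class GGNNLayer(nn.Module):
    """One gated propagation step in the spirit of Li et al. (2016).

    Illustrative sketch only: per-edge gather/scatter keeps memory
    proportional to the number of edges, even when the number of
    edge types (KG relations) is in the hundreds or thousands.
    """
    def __init__(self, hidden_dim: int, num_edge_types: int):
        super().__init__()
        # one transform matrix per edge type, stored as a single tensor
        self.edge_w = nn.Parameter(
            torch.randn(num_edge_types, hidden_dim, hidden_dim) * 0.01)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # h: (N, d) node states; edges: (E, 3) rows of (src, dst, edge_type)
        src, dst, etype = edges[:, 0], edges[:, 1], edges[:, 2]
        # per-edge message: source state transformed by its edge-type matrix
        msg = torch.bmm(h[src].unsqueeze(1), self.edge_w[etype]).squeeze(1)
        # sum incoming messages at each destination node
        agg = torch.zeros_like(h).index_add_(0, dst, msg)
        # GRU-gated update of the node states
        return self.gru(agg, h)
```

A graph with 1,000 relation types costs no more per step than one with 4, as long as the edge count is the same.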

Although our scenario uses GGNN to approximate the ABox consistency checking problem in OWL 2 EL, where each ABox sample can be viewed as a small directed graph so that consistency checking becomes a graph-level binary classification problem, the implementation is quite generic.
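The graph-level classification step could be sketched as follows, using the soft-attention readout from the GGNN paper; this is an illustrative sketch under my own naming, not the repo's actual module.

```python
import torch
import torch.nn as nn

class GraphBinaryReadout(nn.Module):
    """Graph-level readout for binary classification (GGNN-style
    soft attention over nodes). Names are illustrative only."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gate = nn.Linear(hidden_dim, 1)  # soft attention per node
        self.proj = nn.Linear(hidden_dim, 1)  # per-node contribution

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (N, d) final node states of one sample graph (e.g. one ABox)
        attn = torch.sigmoid(self.gate(h))        # (N, 1) node weights
        score = (attn * self.proj(h)).sum(dim=0)  # (1,) pooled graph score
        return torch.sigmoid(score)               # probability of class 1
```

Trained with binary cross-entropy, the output can be read as the predicted probability that the sample (here, an ABox) is consistent.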

Requirements:

Python 3.6
PyTorch >=0.4

Usage:

Citation

Please cite the original GGNN paper:

@inproceedings{li2016gated,
  title={Gated Graph Sequence Neural Networks},
  author={Li, Yujia and Zemel, Richard and Brockschmidt, Marc and Tarlow, Daniel},
  booktitle={Proceedings of ICLR'16},
  year={2016}
}

If you find our implementation useful, please also cite the following paper:

@inproceedings{gu2019local,
  title={Local ABox consistency prediction with transparent TBoxes using gated graph neural networks},
  author={Gu, Yu and Pan, Jeff Z and Cheng, Gong and Paulheim, Heiko and Stoilos, Giorgos},
  booktitle={Proc. 14th International Workshop on Neural-Symbolic Learning and Reasoning (NeSy)},
  year={2019}
}