
ECEformer

Transformer-based Reasoning for Learning Evolutionary Chain of Events on Temporal Knowledge Graph

ECEformer is a novel Transformer-based reasoning model for temporal knowledge graphs (TKGs) that learns the Evolutionary Chain of Events. The paper appears in SIGIR 2024 (an arXiv version is also available).

Installation

The repo requires Python >= 3.7; using Anaconda with a fresh environment is recommended.

conda create -n eceformer python=3.7 -y # optional
conda activate eceformer # optional
git clone git@github.com:seeyourmind/TKGElib.git
cd TKGElib
pip install -e .
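
A quick sanity check that the editable install succeeded (assuming the LibKGE-style command-line entry point this repo builds on registered correctly):

python -m kge --help  # should list the available subcommands (start, eval, test, ...)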

Data

First download the standard benchmark datasets: GDELT, ICEWS14/ICEWS05-15, ICEWS18, and YAGO11k/WikiData12k. Place their contents in the data folder, then process each dataset using the commands below.

cd data
# for GDELT/ICEWS14/ICEWS05-15/ICEWS18
# e.g. python preprocess.py icews14
python preprocess.py $dataset_name
# for YAGO11k and WikiData12k
python preprocess_intravel.py $dataset_name
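
To preprocess several of the event-based datasets in one go, a small shell loop works (the lowercase dataset names mirror the icews14 example above and are illustrative; use the folder names from your download):

for d in gdelt icews14 icews05-15 icews18; do
    python preprocess.py $d
done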

Training

Configurations for the experiments are in the config/ folder.

python -m kge start config/gdelt-best.yaml
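
The run writes checkpoints and logs into a fresh experiment folder. Assuming LibKGE's CLI (the base framework, see Acknowledgment), an interrupted run can typically be resumed from that folder:

python -m kge resume <saved_dir>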

By default, training uses DataParallel across all visible GPUs; this can be overridden by appending --job.device cpu to the command above.
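
For example (the --job.device cpu flag is from the note above; CUDA_VISIBLE_DEVICES is the standard CUDA environment variable for restricting visible GPUs, not a flag of this repo):

python -m kge start config/gdelt-best.yaml --job.device cpu
CUDA_VISIBLE_DEVICES=0 python -m kge start config/gdelt-best.yaml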

Evaluation

You can evaluate a trained model on the dev or test set using the following commands.

python -m kge eval <saved_dir>
python -m kge test <saved_dir>
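
Here <saved_dir> is the experiment folder created by the training run. Assuming LibKGE's default output layout, runs land under local/experiments/, e.g. (folder name hypothetical):

python -m kge test local/experiments/20240501-093000-gdelt-best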

Acknowledgment

Thanks to LibKGE and HittER for providing the preprocessing scripts and the base frameworks.

Citation

@inproceedings{fang-sigir-2024-eceformer,
    title = "Transformer-based Reasoning for Learning Evolutionary Chain of Events on Temporal Knowledge Graph",
    author = "Fang, Zhiyu and Lei, Shuai-Long and Zhu, Xiaobin and Yang, Chun and Zhang, Shi-Xue and Yin, Xu-Cheng and Qin, Jingyan",
    booktitle = "Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval",
    year = "2024"
}