
IterDE

Code and datasets for "IterDE: An Iterative Knowledge Distillation Framework for Knowledge Graph Embeddings" (AAAI 2023).

The repo is built on top of OpenKE.

Framework

Folder Structure

The structure of the folder is shown below:

IterDE
├─checkpoint
├─benchmarks
├─IterDE_FB15K237
├─IterDE_WN18RR
├─openke
├─requirements.txt
└─README.md

An overview of each item:

checkpoint: pre-trained teacher models and distilled student models are saved here.
benchmarks: the benchmark datasets (FB15K-237 and WN18RR).
IterDE_FB15K237: pre-training and distillation scripts for FB15K-237.
IterDE_WN18RR: pre-training and distillation scripts for WN18RR.
openke: the OpenKE-based framework code, including the C++ sources compiled in the Preparation step.
requirements.txt: the Python dependencies.
README.md: this file.

Requirements

All experiments are run on an Intel(R) Xeon(R) Silver 4210 CPU @ 2.20GHz and a GeForce RTX 2080 Ti GPU, with Python 3.7.

Please run the following command to install all dependencies:

pip3 install -r requirements.txt

Usage

Preparation

  1. Enter the openke folder:
cd IterDE
cd openke
  2. Compile the C++ files (a quick sanity check for the compiled library is given below):
bash make.sh
cd ../
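
Optionally, you can verify that the C++ extension was built correctly. The snippet below is a minimal sanity check that assumes the standard OpenKE layout, where make.sh produces the shared library at openke/release/Base.so (run it from the IterDE root):

```python
# Sanity check: try to load the compiled OpenKE shared library via ctypes.
# The path assumes the standard OpenKE layout (openke/release/Base.so).
import ctypes

lib = ctypes.cdll.LoadLibrary("./openke/release/Base.so")
print("OpenKE C++ library loaded successfully")
```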

Example 1: Distill TransE on FB15K-237:

  1. First, pre-train the teacher model TransE (a minimal sketch of such an OpenKE pre-training script follows this example):
cp IterDE_FB15K237/transe_512.py ./
python transe_512.py
  2. Then, iteratively distill the student models. The numbers in each filename give the chain of embedding dimensions (512 → 256 → 128 → 64 → 32): each script distills a smaller student from the model produced by the previous one, so run the scripts in order:
cp IterDE_FB15K237/transe_512_256_new.py ./
cp IterDE_FB15K237/transe_512_256_128_new.py ./
cp IterDE_FB15K237/transe_512_256_128_64_new.py ./
cp IterDE_FB15K237/transe_512_256_128_64_32_new.py ./
python transe_512_256_new.py
python transe_512_256_128_new.py
python transe_512_256_128_64_new.py
python transe_512_256_128_64_32_new.py
  3. Finally, the distilled student models are saved in the checkpoint folder.
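
For reference, here is a minimal sketch of what a teacher pre-training script such as transe_512.py typically looks like when built on OpenKE. The hyperparameters (margin, learning rate, number of epochs, negative sampling settings) and the checkpoint name are illustrative assumptions, not the exact values used in the repository's scripts:

```python
# Minimal OpenKE-style TransE pre-training sketch (teacher, dim = 512).
# Hyperparameters below are illustrative, not the repository's exact settings.
from openke.config import Trainer
from openke.module.model import TransE
from openke.module.loss import MarginLoss
from openke.module.strategy import NegativeSampling
from openke.data import TrainDataLoader

# Training data loader for FB15K-237 with negative sampling.
train_dataloader = TrainDataLoader(
    in_path="./benchmarks/FB15K237/",
    nbatches=100,
    threads=8,
    sampling_mode="normal",
    bern_flag=1,
    filter_flag=1,
    neg_ent=25,
    neg_rel=0)

# 512-dimensional TransE teacher.
transe = TransE(
    ent_tot=train_dataloader.get_ent_tot(),
    rel_tot=train_dataloader.get_rel_tot(),
    dim=512,
    p_norm=1,
    norm_flag=True)

# Margin-based ranking loss with negative sampling.
model = NegativeSampling(
    model=transe,
    loss=MarginLoss(margin=5.0),
    batch_size=train_dataloader.get_batch_size())

# Train the teacher and save its checkpoint (assumed file name).
trainer = Trainer(model=model, data_loader=train_dataloader,
                  train_times=1000, alpha=1.0, use_gpu=True)
trainer.run()
transe.save_checkpoint("./checkpoint/transe_512.ckpt")
```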

Example 2: Distill ComplEx on WN18RR:

  1. First, pre-train the teacher model ComplEx:
cp IterDE_WN18RR/com_wn_512.py ./
python com_wn_512.py
  2. Then, iteratively distill the student models, running the scripts in order as in Example 1:
cp IterDE_WN18RR/com_512_256_new.py ./
cp IterDE_WN18RR/com_512_256_128_new.py ./
cp IterDE_WN18RR/com_512_256_128_64_new.py ./
cp IterDE_WN18RR/com_512_256_128_64_32_new.py ./
python com_512_256_new.py
python com_512_256_128_new.py
python com_512_256_128_64_new.py
python com_512_256_128_64_32_new.py
  3. Finally, the distilled student models are saved in the checkpoint folder (see below for an evaluation sketch).
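
To inspect a distilled student, any checkpoint in the checkpoint folder can be evaluated with OpenKE's standard link-prediction protocol. The sketch below uses the TransE/FB15K-237 chain from Example 1; the 32-dimensional student size and the checkpoint filename are assumptions based on the script names above, so adjust them to whatever the distillation scripts actually wrote:

```python
# Evaluate a distilled student checkpoint with OpenKE's link-prediction protocol.
from openke.config import Tester
from openke.module.model import TransE
from openke.data import TrainDataLoader, TestDataLoader

# The train loader is only used here to read entity/relation counts.
train_dataloader = TrainDataLoader(in_path="./benchmarks/FB15K237/", nbatches=100, threads=8)
test_dataloader = TestDataLoader("./benchmarks/FB15K237/", "link")

# Rebuild the student with the dimensionality it was distilled to (32 is assumed).
student = TransE(
    ent_tot=train_dataloader.get_ent_tot(),
    rel_tot=train_dataloader.get_rel_tot(),
    dim=32,
    p_norm=1,
    norm_flag=True)

# Hypothetical checkpoint name; use the file the distillation scripts produced.
student.load_checkpoint("./checkpoint/transe_512_256_128_64_32.ckpt")

# Report MR, MRR, and Hits@N on the test set.
tester = Tester(model=student, data_loader=test_dataloader, use_gpu=True)
tester.run_link_prediction(type_constrain=False)
```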

Acknowledgement:

Our code is based on OpenKE. We thank its authors for their contributions.

Citation:

If you find this repository helpful, please cite the following paper:

@inproceedings{liu2023iterde,
  title={IterDE: an iterative knowledge distillation framework for knowledge graph embeddings},
  author={Liu, Jiajun and Wang, Peng and Shang, Ziyu and Wu, Chenxiao},
  booktitle={AAAI},
  year={2023}
}