Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning for less than 1 dollar.
Official Implementation of "ChatRule: Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning".
Logical rules are essential for uncovering the logical connections between relations, which can improve reasoning performance and provide interpretable results on knowledge graphs (KGs). In this paper, we propose a novel framework, ChatRule, unleashing the power of large language models (LLMs) to mine logical rules over knowledge graphs for less than 1 dollar. The final rules can be used to conduct reasoning over KGs without additional model training.
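For illustration, a mined logical rule over a family KG could take the form (a hypothetical example, not taken from the paper):

father(X, Z) <- husband(X, Y) & mother(Y, Z)

i.e., if X is the husband of Y and Y is the mother of Z, then X is the father of Z. Rules like this can be applied directly at inference time, which is why no additional model training is needed.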
pip install -r requirements.txt
Set your OpenAI API key in the .env file.
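A minimal .env sketch (this assumes the code reads the conventional OPENAI_API_KEY variable; check the repo's config loading if it differs, and replace the placeholder with your own key):

# .env (assumed variable name)
OPENAI_API_KEY=sk-...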
Please check examples for different datasets and LLMs here.
python path_sampler.py --dataset ${DATASET} --max_path_len 3 --anchor 100 --cores 8
python chat_rule_generator.py --dataset ${DATASET} --model_name gpt-3.5-turbo -f 50 -l 10
python clean_rule.py --dataset ${DATASET} -p gpt-3.5-turbo --model none
python rank_rule.py --dataset ${DATASET} -p clean_rules/${DATASET}/gpt-3.5-turbo-top-0-f-50-l-10/none
python kg_completion.py --dataset ${DATASET} -p ranked_rules/${DATASET}/gpt-3.5-turbo-top-0-f-50-l-10/none/all
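Putting the steps together, a minimal end-to-end sketch for a single dataset (the family dataset is used as an example; the per-step comments are inferred from the script names, not from the paper):

#!/usr/bin/env bash
set -e
DATASET=family

# 1. Sample paths from the KG as raw material for rule mining.
python path_sampler.py --dataset ${DATASET} --max_path_len 3 --anchor 100 --cores 8

# 2. Prompt the LLM to generate candidate logical rules from the sampled paths.
python chat_rule_generator.py --dataset ${DATASET} --model_name gpt-3.5-turbo -f 50 -l 10

# 3. Clean the generated rules (presumably deduplicating and filtering malformed ones).
python clean_rule.py --dataset ${DATASET} -p gpt-3.5-turbo --model none

# 4. Rank the cleaned rules.
python rank_rule.py --dataset ${DATASET} -p clean_rules/${DATASET}/gpt-3.5-turbo-top-0-f-50-l-10/none

# 5. Apply the ranked rules to KG completion.
python kg_completion.py --dataset ${DATASET} -p ranked_rules/${DATASET}/gpt-3.5-turbo-top-0-f-50-l-10/none/all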
python kg_completion.py --dataset family -p FinalRules/family
python kg_completion.py --dataset umls -p FinalRules/umls
python kg_completion.py --dataset wn-18rr -p FinalRules/wn-18rr
python kg_completion.py --dataset yago -p FinalRules/yago
If you find this repo helpful, please cite our paper:
@article{luo2023chatrule,
title={ChatRule: Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning},
author={Luo, Linhao and Ju, Jiaxin and Xiong, Bo and Li, Yuan-Fang and Haffari, Gholamreza and Pan, Shirui},
journal={arXiv preprint arXiv:2309.01538},
year={2023}
}
The KGC reasoning code in this work is mainly based on NCRL, with a bug in the ranking function fixed. We thank the authors for their great work.