
ENRL

This is the implementation of Explainable Neural Rule Learning, published in Proceedings of The ACM Web Conference 2022 (TheWebConf '22).

Environments

The code can be run with the following packages in an Anaconda environment (see ./requirements.txt):

tqdm
treelib
scipy
torchmetrics==0.3.2
py3nvml
pytorch-lightning==1.3.1
numpy
pytorch==1.7.1
pandas
jsonlines
matplotlib
PyYAML
scikit-learn

Other settings with pytorch>=1.3.1 may also work.
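If you set up the environment from scratch, a minimal sketch is shown below; the environment name, Python version, and installation order are illustrative assumptions, not taken from the repository:

# create and activate a conda environment (name and Python version are assumptions)
> conda create -n enrl python=3.7
> conda activate enrl
# install PyTorch 1.7.1 from the pytorch channel, then the remaining packages
> conda install pytorch==1.7.1 -c pytorch
> pip install -r requirements.txt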

Datasets

The processed datasets can be downloaded from Tsinghua Cloud or Google Drive.

Place the datasets in ./dataset/. The directory tree should look like:

.
├── dataset
│   ├── Adult
│   ├── Credit
│   ├── RSC2017
│   └── Synthetic
├── preprocess
├── enrl
├── main.py
└── predict.py
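A minimal sketch of placing a downloaded dataset, assuming the download is a zip archive named after the dataset (the archive name is illustrative):

# hypothetical archive name; adjust to the actual download
> mkdir -p dataset
> unzip Adult.zip -d ./dataset/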

Examples to run the code

# ENRL on Synthetic dataset
> cd ENRL/
> python main.py --model_name ENRL --dataset Synthetic --rule_len 5 --rule_n 40 --es_patience 200 --op_loss 1 --cuda 0
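The other datasets can be run with the same entry point; the hyperparameter values below simply reuse the Synthetic settings as illustrative assumptions, not tuned values from the paper:

# ENRL on Adult dataset (hyperparameters are illustrative)
> python main.py --model_name ENRL --dataset Adult --rule_len 5 --rule_n 40 --es_patience 200 --op_loss 1 --cuda 0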

Cite

If you find this repository useful for your research or development, please cite the following paper:


@inproceedings{10.1145/3485447.3512023,
    author = {Shi, Shaoyun and Xie, Yuexiang and Wang, Zhen and Ding, Bolin and Li, Yaliang and Zhang, Min},
    title = {Explainable Neural Rule Learning},
    year = {2022},
    isbn = {9781450390965},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3485447.3512023},
    doi = {10.1145/3485447.3512023},
    booktitle = {Proceedings of the ACM Web Conference 2022},
    pages = {3031–3041},
    numpages = {11},
    keywords = {explainable neural networks, rule learning, out of distribution},
    location = {Virtual Event, Lyon, France},
    series = {WWW '22}
}