# XRec: Large Language Models for Explainable Recommendation

PyTorch implementation for the paper:

> **XRec: Large Language Models for Explainable Recommendation**
> Qiyao Ma, Xubin Ren, Chao Huang
> *Preprint 2024*
This paper presents a model-agnostic framework, XRec, that integrates the graph-based collaborative filtering framework with Large Language Models (LLMs) to generate comprehensive explanations for recommendations. By leveraging the inherent collaborative user-item relationships and harnessing the powerful textual generation capabilities of LLMs, XRec establishes a strong connection between collaborative signals and language semantics through the utilization of a Mixture of Experts (MoE) adapter.
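Conceptually, the MoE adapter is a small gating network that mixes several expert projections to map the collaborative (GNN-derived) embeddings into the LLM's token space. A minimal NumPy sketch of this idea follows; all dimensions, the expert count, and every name here are illustrative, not the repo's actual implementation:

```python
import numpy as np

def moe_adapt(emb, experts, gate_w):
    """Map a collaborative embedding into the LLM token space.

    emb:     (d_in,) GNN-derived user/item embedding
    experts: list of (d_in, d_out) expert projection matrices
    gate_w:  (d_in, n_experts) gating weights
    Shapes and names are illustrative, not XRec's actual code.
    """
    logits = emb @ gate_w                       # score each expert
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()                        # softmax over experts
    # Gate-weighted sum of the expert projections of the embedding
    return sum(g * (emb @ W) for g, W in zip(gates, experts))

rng = np.random.default_rng(0)
emb = rng.normal(size=64)                       # toy GNN embedding
experts = [rng.normal(size=(64, 128)) for _ in range(4)]
gate_w = rng.normal(size=(64, 4))
token_emb = moe_adapt(emb, experts, gate_w)
print(token_emb.shape)  # (128,)
```

The gate lets the adapter specialize: different experts can handle different regions of the collaborative embedding space while sharing one output interface to the LLM.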
Run the following command to install dependencies:

```
pip install -r requirements.txt
```
We utilize three public datasets: Amazon-books, Google-reviews, and Yelp. To generate user/item profiles and explanations from scratch, enter your OpenAI API key in line 7 of each of these files: `generation/{item_profile/user_profile/explanation}/generate_{profile/exp}.py`, then run:

```
python generation/item_profile/generate_profile.py
python generation/user_profile/generate_profile.py
python generation/explanation/generate_exp.py
```
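Each of these scripts pairs a system prompt with per-user/item prompts and sends them to the OpenAI chat API. A rough sketch of how such a request might be assembled; the helper, the toy prompt contents, and the commented-out API call (including the model name) are assumptions, not the scripts' actual code:

```python
import json

def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Assemble an OpenAI-style chat request for profile generation.

    The message structure follows the OpenAI chat-completions format;
    XRec's real prompts live in the repo's *_system_prompt.json and
    *_prompts.json files.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

# Toy stand-ins for the contents of generation/item_profile/*.json
system_prompt = "Summarize what kind of customers would enjoy this business."
item_prompt = json.dumps({"name": "MD Oriental Market", "category": "Grocery"})
messages = build_messages(system_prompt, item_prompt)
# response = client.chat.completions.create(model=..., messages=messages)
print(len(messages))  # 2
```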
Each of the commands below can be run independently, since the finetuned LLM and generated explanations are provided within the data. Prepare your Hugging Face User Access Token for downloading the Llama 2 model.

```
python explainer/main.py --mode finetune --dataset {dataset}   # finetune the explainer
python explainer/main.py --mode generate --dataset {dataset}   # generate explanations
python explainer/sample.py --dataset {dataset}                 # inspect sample outputs
python evaluation/main.py --dataset {dataset}                  # evaluate explanations
```
Supported datasets: `amazon`, `google`, `yelp`.

Below is an example of generating an explanation for a specific user-item recommendation using the `yelp` dataset.
**Item profile:** MD Oriental Market is summarized to attract fans of Asian cuisine, individuals looking for a variety of Asian products, and those seeking unique and ethnic food items. Customers interested in a well-organized, spacious, and clean grocery store with a diverse selection of Asian ingredients and products would also appreciate this location.

**User profile:** This user is likely to enjoy casual American comfort food, barbecue with various meat options and tasty sauces, high-quality dining experiences with tasting menus, and authentic Italian food and beverages in cozy atmospheres.

**Generated explanation:** The user would enjoy this business for its vast selection of Asian ingredients, including fresh produce, sauces, condiments, and spices, making it a go-to for authentic and diverse cooking options.
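Generated explanations (`tst_pred.pkl`) are scored against the ground truth (`tst_ref.pkl`) by `evaluation/main.py`, which includes an LLM-based GPTScore judge (see `evaluation/system_prompt.txt`). To give a flavor of text-overlap scoring, here is a minimal unigram-F1 sketch; this is an illustrative metric, not the implementation in `evaluation/metrics.py`:

```python
from collections import Counter

def unigram_f1(pred: str, ref: str) -> float:
    """Token-overlap F1 between a predicted and reference explanation.

    Illustrative only; not the metric used in evaluation/metrics.py.
    """
    p, r = pred.lower().split(), ref.lower().split()
    if not p or not r:
        return 0.0
    overlap = sum((Counter(p) & Counter(r)).values())  # clipped matches
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(p), overlap / len(r)
    return 2 * precision * recall / (precision + recall)

pred = "The user would enjoy the vast selection of Asian ingredients"
ref = "The user would enjoy this business for its vast selection of Asian ingredients"
print(round(unigram_f1(pred, ref), 3))  # 0.783
```

Overlap metrics like this reward surface similarity only, which is why LLM-based judging such as GPTScore is used alongside them.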
```
├── README.md
├── data (amazon/google/yelp)
│   ├── data.json                  # user/item profiles with explanations
│   ├── trn/val/tst.pkl            # splits of data.json
│   ├── total_trn/val/tst.csv      # user-item interactions
│   ├── user/item_emb.pkl          # user/item embeddings
│   ├── user/item_converter.pkl    # MoE adapter
│   ├── tst_pred.pkl               # generated explanations
│   └── tst_ref.pkl                # ground-truth explanations
├── encoder
│   ├── models                     # GNN structure
│   ├── utils
│   └── train_encoder.py           # derive user/item embeddings
├── explainer
│   ├── models
│   │   ├── explainer.py           # XRec model
│   │   └── modeling_explainer.py  # modified PyTorch LLaMA model
│   ├── utils
│   ├── main.py                    # employ XRec
│   └── sample.py                  # see samples of generated explanations
├── generation
│   ├── instructions               # system prompts for user/item profiles and explanations
│   ├── item_profile               # generate item profiles
│   │   ├── item_prompts.json
│   │   ├── item_system_prompt.json
│   │   └── generate_profile.py
│   ├── user_profile               # generate user profiles
│   │   ├── user_prompts.json
│   │   ├── user_system_prompt.json
│   │   └── generate_profile.py
│   └── explanation                # generate ground-truth explanations
│       ├── exp_prompts.json
│       ├── exp_system_prompts.json
│       └── generate_exp.py
└── evaluation
    ├── main.py
    ├── metrics.py
    └── system_prompt.txt          # system prompt for GPTScore
```
If you find XRec helpful to your research or applications, please kindly cite:

```bibtex
@article{ma2024xrec,
  title={XRec: Large Language Models for Explainable Recommendation},
  author={Ma, Qiyao and Ren, Xubin and Huang, Chao},
  journal={arXiv preprint arXiv:2406.02377},
  year={2024}
}