This repository is the PyTorch implementation of the paper:
Yunjun Gao, Yuntao Du, Yujia Hu, Lu Chen, Xinjun Zhu, Ziquan Fang, and Baihua Zheng. (2022). Self-Guided Learning to Denoise for Robust Recommendation. In SIGIR'22, Madrid, Spain, July 11-15, 2022. (Paper in ACM DL or Paper in arXiv.)
Self-Guided Denoising Learning (SGDL) is a new denoising paradigm that collects memorized interactions at the early stage of training and leverages them as denoising signals to guide the subsequent training of the model in a meta-learning manner. In addition, SGDL automatically switches its learning phase from memorization to self-guided learning at the memorization point, and selects clean and informative memorized data via an adaptive denoising scheduler to further improve robustness.
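The snippet below is a minimal, illustrative sketch of this two-phase idea (a memorization stage, followed by meta-learned sample reweighting guided by the memorized interactions). It is not the repository's actual implementation: the toy model, data, thresholds, step sizes, and helper names (`ToyRecModel`, `run_sketch`) are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyRecModel(nn.Module):
    """Tiny matrix-factorization scorer, for illustration only."""

    def __init__(self, n_users, n_items, dim=16):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)

    def forward(self, u, i):
        return (self.user(u) * self.item(i)).sum(-1)  # interaction score


def run_sketch():
    torch.manual_seed(0)
    n_users, n_items, n_inter = 50, 40, 500
    users = torch.randint(0, n_users, (n_inter,))
    items = torch.randint(0, n_items, (n_inter,))
    labels = torch.ones(n_inter)  # implicit-feedback positives (possibly noisy)

    model = ToyRecModel(n_users, n_items)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)

    # Phase 1 (memorization): train normally and record how often each
    # interaction is fit with low loss, i.e. "memorized" early in training.
    memo_count = torch.zeros(n_inter)
    for _ in range(5):  # fixed early epochs here; SGDL instead detects the
                        # memorization point automatically.
        loss_per = F.binary_cross_entropy_with_logits(
            model(users, items), labels, reduction="none")
        memo_count += (loss_per.detach() < 0.5).float()
        opt.zero_grad()
        loss_per.mean().backward()
        opt.step()

    # Memorized interactions serve as (assumed-clean) guidance data.
    memorized = memo_count >= 4
    if memorized.sum() == 0:  # fallback so the sketch always runs
        memorized = memo_count >= memo_count.max()

    # Phase 2 (self-guided learning): learn per-sample weights so that training
    # on the full, possibly noisy set also performs well on the memorized
    # subset -- a generic meta-reweighting step, not the paper's exact scheduler.
    inner_lr, meta_lr = 1e-2, 100.0  # illustrative step sizes
    for _ in range(5):
        weights = torch.zeros(n_inter, requires_grad=True)
        loss_per = F.binary_cross_entropy_with_logits(
            model(users, items), labels, reduction="none")
        weighted_loss = (torch.sigmoid(weights) * loss_per).mean()

        # Virtual one-step update of the embeddings on the weighted loss.
        u_w, i_w = model.user.weight, model.item.weight
        g_u, g_i = torch.autograd.grad(weighted_loss, [u_w, i_w], create_graph=True)
        u_new, i_new = u_w - inner_lr * g_u, i_w - inner_lr * g_i

        # Meta objective: loss of the virtually updated model on memorized data.
        meta_scores = (u_new[users[memorized]] * i_new[items[memorized]]).sum(-1)
        meta_loss = F.binary_cross_entropy_with_logits(meta_scores, labels[memorized])

        # Refine the sample weights with the meta gradient, then take a real
        # model update using the refined weights.
        (w_grad,) = torch.autograd.grad(meta_loss, weights)
        with torch.no_grad():
            final_w = torch.sigmoid(weights - meta_lr * w_grad)
        loss = (final_w * F.binary_cross_entropy_with_logits(
            model(users, items), labels, reduction="none")).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()


if __name__ == "__main__":
    run_sketch()
```

In the actual code, the recommendation backbone, the automatic memorization-point detection, and the adaptive denoising scheduler replace the fixed epoch counts, loss threshold, and hand-set step sizes used in this sketch.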
If you want to use our codes and datasets in your research, please cite:
@inproceedings{SGDL22,
title = {Self-Guided Learning to Denoise for Robust Recommendation},
author = {Yunjun Gao and
Yuntao Du and
Yujia Hu and
Lu Chen and
Xinjun Zhu and
Ziquan Fang and
Baihua Zheng},
pages = {1412--1422},
booktitle = {{SIGIR}},
year = {2022}
}
Three popular public recommendation datasets are used in our research: MovieLens-100K, Adressa, and Yelp.
To demonstrate the reproducibility of the best performance reported in our paper and to help researchers check whether their model status is consistent with ours, we provide the best parameter settings (which might differ for customized datasets) in the scripts below, together with our training logs.
python main.py --batch_size 128 --lr 0.0005 --meta_lr 0.0005 --model lgn --eval_freq 5 --stop_step 8 --dataset ml100k
python main.py --batch_size 1024 --lr 0.0005 --meta_lr 0.0005 --model lgn --eval_freq 10 --stop_step 4 --dataset adressa
python main.py --batch_size 2048 --lr 0.0005 --meta_lr 0.0005 --model lgn --eval_freq 10 --stop_step 4 --dataset yelp