hw-du / CBiT

Implementation of the paper "Contrastive Learning with Bidirectional Transformers for Sequential Recommendation".
GNU General Public License v3.0

Reproduce the results of BERT4Rec and BERT4RecS #3

Closed Aidenzich closed 1 year ago

Aidenzich commented 1 year ago

Hello, thank you for your masterpiece. However, I would like to ask how to reproduce the results of BERT4Rec and BERT4RecS in the paper. I used the same parameters and SlidingWindow technique as in the paper, but the highest NDCG@10 scores I obtained were 0.02884 and 0.0358 respectively, which are quite different from the paper's results of 0.0306 and 0.043. I don't think this difference is due to the random seed.

If you know how to solve this problem, please kindly advise. Thank you.

hw-du commented 1 year ago

Hi, if you set num_positive=1 and disable contrastive learning, then it's BERT4RecS. If you set num_positive=1, disable contrastive learning, and also disable the sliding window, then it's BERT4Rec.
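The two ablations above can be sketched as config overrides. This is a minimal illustration, not the repo's actual argument names: num_positive comes from the comment above, but the keys contrastive_learning and sliding_window are hypothetical placeholders to map onto whatever flags the training script actually exposes.

```python
# Hypothetical config sketch for the two baselines described above.
# Only "num_positive" is named in the comment; the other keys are
# placeholders for the repo's real contrastive-learning and
# sliding-window switches.

CBIT_BASE = {
    "num_positive": 3,            # CBiT itself uses multiple positives
    "contrastive_learning": True,
    "sliding_window": True,
}

def bert4recs_config(base):
    """BERT4RecS: one positive, contrastive loss off, sliding window kept."""
    cfg = dict(base)
    cfg.update({"num_positive": 1, "contrastive_learning": False})
    return cfg

def bert4rec_config(base):
    """BERT4Rec: additionally disable the sliding-window augmentation."""
    cfg = bert4recs_config(base)
    cfg["sliding_window"] = False
    return cfg
```

The only difference between the two baselines is the sliding-window flag, which is why building bert4rec_config on top of bert4recs_config keeps the relationship explicit.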

Aidenzich commented 1 year ago

Thanks for the help! I think the issue is due to a difference in the validation dataset.