Coder-Yu / SELFRec

An open-source framework for self-supervised recommender systems.

Questions with reg_loss~ #19

Closed user-AC closed 1 year ago

user-AC commented 1 year ago

Thank you for your excellent work. I noticed that the regularization loss differs slightly across models. In LightGCN, the total loss is:

batch_loss = bpr_loss(user_emb, pos_item_emb, neg_item_emb) + l2_reg_loss(self.reg, user_emb, pos_item_emb, neg_item_emb) / self.batch_size

In SGL, the total loss is:

batch_loss = rec_loss + l2_reg_loss(self.reg, user_emb, pos_item_emb, neg_item_emb) + cl_loss

In SimGCL, the total loss is:

batch_loss = rec_loss + l2_reg_loss(self.reg, user_emb, pos_item_emb) + cl_loss

The l2_reg_loss() terms in these three losses are different: LightGCN divides by the batch size, SGL does not, and the third model also omits neg_item_emb. Is there something I missed? Looking forward to your reply.
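To see the numerical effect of the three call patterns side by side, here is a minimal sketch. The l2_reg_loss helper below is an assumption (a sum of squared L2 norms scaled by reg, a common convention), not necessarily SELFRec's exact implementation, and the toy embeddings are made up:

```python
import numpy as np

def l2_reg_loss(reg, *embeddings):
    """Hypothetical sketch: sum of squared L2 norms of each embedding
    batch, scaled by the regularization coefficient (assumption)."""
    total = 0.0
    for emb in embeddings:
        total += np.sum(emb ** 2) / 2
    return reg * total

# Toy batch of embeddings (batch_size=4, dim=8) for illustration only
rng = np.random.default_rng(0)
user_emb = rng.normal(size=(4, 8))
pos_item_emb = rng.normal(size=(4, 8))
neg_item_emb = rng.normal(size=(4, 8))
reg, batch_size = 1e-4, 4

# The three variants from the question:
lightgcn_reg = l2_reg_loss(reg, user_emb, pos_item_emb, neg_item_emb) / batch_size
sgl_reg = l2_reg_loss(reg, user_emb, pos_item_emb, neg_item_emb)
no_neg_reg = l2_reg_loss(reg, user_emb, pos_item_emb)

print(lightgcn_reg, sgl_reg, no_neg_reg)
```

Under this convention, the SGL-style term is exactly batch_size times the LightGCN-style term, and dropping neg_item_emb can only make the term smaller, so the three models effectively apply different regularization strengths for the same reg value.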

Coder-Yu commented 1 year ago

It's just a trick for better optimization. You can make them the same term if you wish and see the difference.
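Following the author's suggestion, one way to make the term identical across models is to pick a single convention, e.g. the batch-averaged form from the LightGCN snippet, and use it everywhere. This is a sketch under that assumption (the helper name and squared-norm convention are illustrative, not from the repo):

```python
import numpy as np

def unified_reg(reg, batch_size, *embeddings):
    """One consistent choice: batch-averaged squared L2 norm of every
    embedding in the BPR triple (assumption, mirroring the LightGCN form)."""
    total = sum(np.sum(e ** 2) / 2 for e in embeddings)
    return reg * total / batch_size

# Toy data; in practice these would be the model's embedding lookups
rng = np.random.default_rng(1)
u, p, n = (rng.normal(size=(4, 8)) for _ in range(3))

# Drop-in replacement for the l2_reg_loss term in any of the three
# batch_loss expressions above:
reg_term = unified_reg(1e-4, 4, u, p, n)
print(reg_term)
```

Swapping this single term into all three models would isolate whether the per-model variations actually help, which is what the author suggests trying.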