Closed user-AC closed 1 year ago
Thank you for your excellent work. I noticed slightly different regularization losses in different models. In LightGCN, the total loss is:
batch_loss = bpr_loss(user_emb, pos_item_emb, neg_item_emb) + l2_reg_loss(self.reg, user_emb, pos_item_emb, neg_item_emb) / self.batch_size
In SGL, the total loss is:
batch_loss = rec_loss + l2_reg_loss(self.reg, user_emb, pos_item_emb,neg_item_emb) + cl_loss
In SimGCL, the total loss is:
batch_loss = rec_loss + l2_reg_loss(self.reg, user_emb, pos_item_emb) + cl_loss
The l2_reg_loss() terms in these three losses are different. Is there something I missed? Looking forward to your reply.
Just a trick for better optimization. You can make them the same term if you wish and see whether it makes a difference.
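To make the three variants concrete, here is a minimal NumPy sketch. It assumes l2_reg_loss computes the sum of squared L2 norms of its embedding arguments scaled by the regularization weight; the actual helper in the repository may differ in detail, and the shapes below are made up for illustration.

```python
import numpy as np

def l2_reg_loss(reg, *embeddings):
    # Hypothetical sketch of an L2 regularizer: sum of squared Frobenius
    # norms of each embedding matrix, scaled by the regularization weight.
    return reg * sum(np.linalg.norm(emb) ** 2 for emb in embeddings)

rng = np.random.default_rng(0)
batch_size = 256
user_emb = rng.standard_normal((batch_size, 64))
pos_item_emb = rng.standard_normal((batch_size, 64))
neg_item_emb = rng.standard_normal((batch_size, 64))
reg = 1e-4

# LightGCN-style: all three embeddings, averaged over the batch
reg_lightgcn = l2_reg_loss(reg, user_emb, pos_item_emb, neg_item_emb) / batch_size
# SGL-style: the same three embeddings, but no division by batch size
reg_sgl = l2_reg_loss(reg, user_emb, pos_item_emb, neg_item_emb)
# SimGCL-style: the negative-item embeddings are dropped from the penalty
reg_simgcl = l2_reg_loss(reg, user_emb, pos_item_emb)
```

The three penalties differ only in scale (division by batch size) and in which embeddings are penalized, which is why they can be unified if desired.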