cheungdaven / DeepRec

An Open-source Toolkit for Deep Learning based Recommendation with TensorFlow.
GNU General Public License v3.0

[Question] Confusion about the Caser loss function #12

Closed caowencai closed 5 years ago

caowencai commented 5 years ago

The loss function in the original paper is as follows: [image: loss function from the Caser paper]

It differs from the implementation in the code.

Comparing against the BPR paper and its code: is the implementation here adding a prior distribution (Gaussian) on the parameters θ on top of the likelihood from the original Caser paper? [image]

I am not sure whether this understanding is correct.
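For reference, the usual MAP derivation that connects an L2 penalty to a Gaussian prior (generic notation, not taken from either paper):

    \hat{\Theta}_{\mathrm{MAP}}
        = \arg\max_{\Theta} \bigl[ \log p(D \mid \Theta) + \log p(\Theta) \bigr]

    p(\Theta) = \mathcal{N}(\Theta; 0, \sigma^2 I)
        \;\Rightarrow\;
        -\log p(\Theta) = \frac{1}{2\sigma^2} \lVert \Theta \rVert^2 + \mathrm{const}

so maximizing the posterior is the same as minimizing the negative log-likelihood plus \lambda \lVert \Theta \rVert^2 with \lambda = 1/(2\sigma^2).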

cheungdaven commented 5 years ago

The loss is the same as the code given by the author of Caser:

    # compute the binary cross-entropy loss
    positive_loss = -torch.mean(
        torch.log(torch.sigmoid(targets_prediction)))
    negative_loss = -torch.mean(
        torch.log(1 - torch.sigmoid(negatives_prediction)))
    loss = positive_loss + negative_loss

https://github.com/graytowne/caser_pytorch/blob/master/train_caser.py
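In equation form, this snippet minimizes the mean binary cross-entropy over the positive targets and the sampled negatives (writing D⁺ and D⁻ for the two item sets; notation introduced here for clarity, not from the paper):

    \ell = -\frac{1}{|D^{+}|} \sum_{i \in D^{+}} \log \sigma(\hat{y}_i)
           -\frac{1}{|D^{-}|} \sum_{j \in D^{-}} \log \bigl( 1 - \sigma(\hat{y}_j) \bigr)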

caowencai commented 5 years ago


The implementation in this repo also adds a regularization term. Source:

    self.loss = -tf.reduce_mean(tf.log(tf.sigmoid(self.target_prediction) + 1e-10)) \
                - tf.reduce_mean(tf.log(1 - tf.sigmoid(self.negative_prediction) + 1e-10)) \
                + self.reg_rate * (tf.nn.l2_loss(self.P) + tf.nn.l2_loss(self.Q)
                                   + tf.nn.l2_loss(self.V) + tf.nn.l2_loss(self.W)
                                   + tf.nn.l2_loss(self.b)) \
                + tf.losses.get_regularization_loss()

https://github.com/cheungdaven/DeepRec/blob/257850eff019ad8ff6deaec7962726367a3b37d0/models/seq_rec/Caser.py#L79
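Written out, and using the fact that tf.nn.l2_loss(t) computes sum(t ** 2) / 2, the line above minimizes (ε = 1e-10 is only there for numerical stability, and any extra tf.losses.get_regularization_loss() contribution is omitted):

    \ell(\Theta) = -\frac{1}{|D^{+}|} \sum_{i} \log \bigl( \sigma(\hat{y}_i) + \varepsilon \bigr)
                   -\frac{1}{|D^{-}|} \sum_{j} \log \bigl( 1 - \sigma(\hat{y}_j) + \varepsilon \bigr)
                   + \frac{\lambda}{2} \bigl( \lVert P \rVert^2 + \lVert Q \rVert^2 + \lVert V \rVert^2 + \lVert W \rVert^2 + \lVert b \rVert^2 \bigr)

with λ = reg_rate, i.e. the binary cross-entropy above plus an L2 penalty on all parameters.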

cheungdaven commented 5 years ago

Adding regularization is also meant to prevent overfitting. The original code applies L2 regularization as well, via the optimizer's weight decay:

    self._optimizer = optim.Adam(self._net.parameters(),
                                 weight_decay=self._l2,
                                 lr=self._learning_rate)

I have tested both repos, and they perform identically on the same dataset.
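For anyone wondering whether the two formulations really coincide: a minimal sketch (the names x, w1, w2, l2 are made up for illustration, not from either repo) showing that an explicit (λ/2)·‖w‖² penalty in the loss yields the same Adam update as passing weight_decay=λ, since PyTorch's Adam adds weight_decay * param to the gradient before the moment updates:

    import torch
    from torch import optim

    torch.manual_seed(0)
    x = torch.randn(5)   # dummy input
    l2 = 0.01            # plays the role of reg_rate / self._l2

    w1 = torch.randn(5, requires_grad=True)
    w2 = w1.detach().clone().requires_grad_(True)

    # Variant A: explicit penalty in the loss,
    # matching reg_rate * tf.nn.l2_loss(w) = l2 * sum(w**2) / 2
    opt_a = optim.Adam([w1], lr=0.1)
    loss_a = (w1 @ x) ** 2 + l2 * (w1 ** 2).sum() / 2
    loss_a.backward()
    opt_a.step()

    # Variant B: same data loss, L2 handled by the optimizer's weight_decay
    opt_b = optim.Adam([w2], lr=0.1, weight_decay=l2)
    loss_b = (w2 @ x) ** 2
    loss_b.backward()
    opt_b.step()

    print(torch.allclose(w1, w2))  # True: identical parameter updates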