geffy / tffm

TensorFlow implementation of an arbitrary-order Factorization Machine
MIT License

Use of self.train_w in TFFMCore.init_loss() #40

Closed: martincousi closed this issue 6 years ago

martincousi commented 6 years ago

I am trying to understand how tffm works, but I can't figure out why self.loss is obtained by multiplying the output of self.loss_function by self.train_w in the class TFFMCore. I would have thought that self.train_w shouldn't be there...

    def init_loss(self):
        with tf.name_scope('loss') as scope:
            self.loss = self.loss_function(self.outputs, self.train_y) * self.train_w
            self.reduced_loss = tf.reduce_mean(self.loss)
            tf.summary.scalar('loss', self.reduced_loss)
geffy commented 6 years ago

Hi @martincousi, self.train_w is a vector of per-sample importance weights. It's essentially the same thing as the sample_weight param in the sklearn API.
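
For illustration, here is a minimal numpy sketch of that weighting. The names outputs, train_y, and train_w mirror the snippet above; the values and the use of squared error as the per-sample loss are hypothetical (tffm's actual loss_function depends on the configured task):

```python
import numpy as np

# Toy predictions, targets, and per-sample weights (hypothetical values)
outputs = np.array([0.9, 0.2, 0.7])
train_y = np.array([1.0, 0.0, 1.0])
train_w = np.array([1.0, 1.0, 2.0])  # third sample counts double

# Example per-sample loss (squared error, just for illustration)
per_sample_loss = (outputs - train_y) ** 2

# Scale each sample's loss by its importance weight, as in init_loss()
weighted_loss = per_sample_loss * train_w

# reduced_loss is the mean of the weighted per-sample losses
reduced_loss = weighted_loss.mean()
```

With uniform weights (all ones), this reduces to the ordinary mean loss; raising a sample's weight makes its errors count proportionally more in the gradient.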

martincousi commented 6 years ago

Ah ok! I confused self.train_w with self.w. Thank you for the quick reply.