thunlp / JointNRE

Joint Neural Relation Extraction with Text and KGs
MIT License

the loss is calculated twice in cnn? #8

Closed · yzho0907 closed this issue 5 years ago

yzho0907 commented 5 years ago

```python
self.loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=self.label, logits=logits))
self.loss = tf.losses.softmax_cross_entropy(onehot_labels=self.label, logits=logits, weights=self.weights)
```

The loss is assigned twice here, so the second line overwrites the first. Any reason plz? thx

THUCSTHanxu13 commented 5 years ago

The first one is a plain cross-entropy loss. The second one is a weighted loss function that can assign a weight to each instance in a batch during training. When I implemented the code, I tried both of them. There was no obvious difference between them, so you can choose either one.
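To make the difference concrete, here is a minimal NumPy sketch (not the repository's code) of the two variants. The unweighted version mirrors `tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(...))`; the weighted version mirrors the behavior of `tf.losses.softmax_cross_entropy` under its TF1 default reduction, which sums the weighted per-instance losses and divides by the number of nonzero weights. With all weights equal to 1 the two losses coincide, which is consistent with the observation above that they behave similarly.

```python
import numpy as np

def softmax_cross_entropy(labels, logits):
    """Per-instance cross entropy for one-hot labels, computed stably."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(labels * log_probs).sum(axis=1)

def unweighted_loss(labels, logits):
    """Plain mean cross entropy over the batch (first variant)."""
    return softmax_cross_entropy(labels, logits).mean()

def weighted_loss(labels, logits, weights):
    """Per-instance weights, normalized by the count of nonzero weights
    (second variant, assuming TF1's default SUM_BY_NONZERO_WEIGHTS reduction)."""
    per_instance = weights * softmax_cross_entropy(labels, logits)
    return per_instance.sum() / np.count_nonzero(weights)

# With uniform weights the two variants give the same value:
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.2, 0.3]])
print(np.isclose(unweighted_loss(labels, logits),
                 weighted_loss(labels, logits, np.ones(2))))
```

Non-uniform weights (e.g. to down-weight noisy distantly-supervised instances) are the only case where the two losses actually diverge.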