sailab-code / gnn

Graph Neural Network

The loss function of Net_Subgraph #7

Closed lihuiliullh closed 5 years ago

lihuiliullh commented 5 years ago

Dear Authors:

May I know what is the meaning of the loss function in Net_Subgraph? It is a little difficult for me to understand.

```python
output = tf.maximum(output, self.EPSILON, name="Avoiding_explosions")  # to avoid explosions
xent = -tf.reduce_sum(target * tf.log(output), 1)
lo = tf.reduce_mean(xent)
```

Best.

mtiezzi commented 5 years ago

Hi. It is an implementation of the common softmax cross-entropy loss function. However, because the outputs are already probabilities (a softmax has already been applied), we cannot use TensorFlow's `xent = tf.nn.softmax_cross_entropy_with_logits(logits, labels)`, which expects raw logits. We reimplemented it, avoiding numerical instability with the first line, `output = tf.maximum(output, self.EPSILON, name="Avoiding_explosions")`: it prevents zero values in the output, which would otherwise produce `-inf`/`NaN` values when the logarithm is taken.
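For illustration, here is an equivalent NumPy sketch of that clamped cross-entropy (the `EPSILON` value below is just an example; the repository defines its own `self.EPSILON`):

```python
import numpy as np

EPSILON = 1e-7  # illustrative clamp value, not the repo's actual setting

def clamped_cross_entropy(output, target):
    """Cross-entropy over already-softmaxed outputs, clamped to avoid log(0)."""
    output = np.maximum(output, EPSILON)             # mirrors tf.maximum(output, self.EPSILON)
    xent = -np.sum(target * np.log(output), axis=1)  # per-example cross-entropy
    return np.mean(xent)                             # batch mean, mirrors tf.reduce_mean

# Without the clamp, the second row would hit log(0) = -inf and poison the mean.
probs = np.array([[0.7, 0.3],
                  [1.0, 0.0]])
target = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
loss = clamped_cross_entropy(probs, target)  # finite, thanks to the clamp
```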

Here you can find a reference for the cross-entropy reimplementation.