hadyelsahar / CNN-RelationExtraction

Convolution neural network for relation extraction between two given entities

[Potential NAN bug] Loss may become NAN during training #8

Open Justobe opened 3 years ago

Justobe commented 3 years ago

Hello~

Thank you very much for sharing the code!

I tried to use my own dataset (with the same shape as MNIST) with the code. After some iterations, the training loss became NaN. After carefully checking the code, I found that the following line may produce NaN in the loss:

In CNN-RelationExtraction/CNN.py: line 104

cross_entropy = -tf.reduce_sum(self.y_ * tf.log(self.y_conv))

If y_conv (the output of the softmax) contains a 0, tf.log(y_conv) yields -inf because log(0) is undefined. When that -inf is then multiplied by a 0 entry of the one-hot label y_, the product is NaN, so the summed loss becomes NaN.
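As a minimal illustration (not code from this repository), the same behaviour can be reproduced with plain NumPy: a zero probability under the log gives -inf, and multiplying that -inf by a zero label entry turns the sum into NaN.

```python
import numpy as np

# One-hot label and a softmax output that has saturated to exactly 0/1.
y_     = np.array([0.0, 1.0, 0.0])
y_conv = np.array([0.0, 1.0, 0.0])

with np.errstate(divide='ignore', invalid='ignore'):
    log_probs = np.log(y_conv)       # [-inf, 0.0, -inf]
    loss = -np.sum(y_ * log_probs)   # 0 * -inf = nan, so the whole sum is nan

print(loss)  # nan
```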

It could be fixed with either of the following changes:

cross_entropy = -tf.reduce_sum(self.y_ * tf.log(self.y_conv + 1e-8))

or

cross_entropy = -tf.reduce_sum(self.y_ * tf.log(tf.clip_by_value(self.y_conv, 1e-8, 1.0)))
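Another option, just as a sketch and assuming the pre-softmax activations are still accessible in the graph (here under a hypothetical name `logits`), is to let TensorFlow compute the cross-entropy directly from the logits, which handles the log in a numerically stable way:

```python
# Sketch only: `logits` is a hypothetical handle to the layer feeding the softmax;
# CNN.py currently applies the softmax itself and exposes only self.y_conv.
cross_entropy = tf.reduce_sum(
    tf.nn.softmax_cross_entropy_with_logits(labels=self.y_, logits=logits))
```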

Hope to hear from you ~

Thanks in advance! : )