xiangwang1223 / neural_graph_collaborative_filtering

Neural Graph Collaborative Filtering, SIGIR2019
MIT License

How to solve loss is NAN? #17

Closed by Jhy1993 4 years ago

Jhy1993 commented 4 years ago

Hi, thanks for your brilliant code. However, I have run your code several times and always get "loss is NaN". How can I avoid this?

xiangwang1223 commented 4 years ago

Hi, thanks for your interest. In my experience, try making the learning rate and the L2 regularization coefficient smaller, say 10e-5. Thanks.

xiangwang1223 commented 4 years ago

Hi, I learned this from my peers. You can try the following code to modify the BPR loss in order to avoid NaN:

loss = tf.reduce_sum(tf.nn.softplus(-(pos_result - neg_result)))

Please let me know whether it works. Thanks.
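For context on why this change helps: a common way to write BPR loss is `-tf.log(tf.sigmoid(pos_result - neg_result))`, and when the score difference is very negative the sigmoid underflows to 0, so the log returns -inf and the loss becomes NaN/inf. `softplus(-x)` equals `-log(sigmoid(x))` mathematically but can be computed stably. A minimal NumPy sketch (not the repository's TensorFlow code; `diff` stands in for `pos_result - neg_result`) illustrating the difference:

```python
import numpy as np

def naive_bpr_loss(diff):
    # -log(sigmoid(diff)): for very negative diff, sigmoid underflows to 0,
    # so log(0) = -inf and the loss blows up (inf/NaN once gradients flow).
    with np.errstate(over="ignore", divide="ignore"):
        return -np.log(1.0 / (1.0 + np.exp(-diff)))

def softplus_bpr_loss(diff):
    # softplus(-diff) = log(1 + exp(-diff)) is the same quantity, computed
    # stably via the identity log(1 + exp(x)) = max(x, 0) + log1p(exp(-|x|)).
    x = -diff
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

diff = np.array([-800.0])  # extreme negative margin: neg item scored far above pos
print(naive_bpr_loss(diff))     # [inf]  -- numerically broken
print(softplus_bpr_loss(diff))  # [800.] -- finite, correct value
```

This is why swapping in `tf.nn.softplus(-(pos_result - neg_result))` removes the NaNs without changing what the loss means.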

Jhy1993 commented 4 years ago

It works! Thanks for your advice.