shiba24 / learning2rank

Learning to rank with neuralnet - RankNet and ListNet

Runtime Warning and unchanged loss #23

Open Besteverandever opened 4 years ago

Besteverandever commented 4 years ago

Hi, I am facing two problems when using ListNet from this code with the LETOR dataset. Problem 1: my loss does not seem to be decreasing. This is the situation at the start:

```
epoch: 2 NDCG@100 | train: 0.2016394568476937, test: 0.19944033792067814
```

and these values are unchanged 200 epochs later:

```
epoch: 200 train mean loss=0.0 test mean loss=0.0
epoch: 201 NDCG@100 | train: 0.2016394568476937, test: 0.19944033792067814
```

Can you please comment on why the loss isn't changing at all?
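My guess (an assumption on my part, not taken from this repo's code) is that the log warnings below make the loss non-finite, the NaN then poisons the weights through the gradient update, and from then on the scores, and therefore the NDCG, never change. A minimal NumPy sketch of that mechanism:

```python
import numpy as np

# Hypothetical illustration: a NaN loss gives a NaN gradient, and a
# single plain SGD step then turns every weight into NaN, freezing
# the model's ranking (and the reported NDCG) for all later epochs.
w = np.array([0.3, -0.2])               # some model weights
lr = 0.01
bad_grad = np.array([np.nan, np.nan])   # gradient of a NaN loss
w = w - lr * bad_grad                   # every entry becomes NaN
print(w)                                # [nan nan]
```

If that is what is happening, the frozen NDCG and the zero reported loss would both be symptoms of the same log(0) problem.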

Problem 2: Chainer math warnings in the log functions. Can you please tell me how to get rid of the following warnings?

```
..Anaconda3\lib\site-packages\chainer\functions\math\exponential.py:47: RuntimeWarning: divide by zero encountered in log
  return utils.force_array(numpy.log(x[0])),
..Anaconda3\lib\site-packages\chainer\functions\math\exponential.py:47: RuntimeWarning: invalid value encountered in log
  return utils.force_array(numpy.log(x[0])),
..Anaconda3\lib\site-packages\chainer\functions\math\basic_math.py:240: RuntimeWarning: invalid value encountered in multiply
  return utils.force_array(x[0] * x[1]),
```
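For what it's worth, the same RuntimeWarnings can be reproduced in plain NumPy, which suggests a softmax probability is underflowing to exactly 0 before the log. A sketch of the symptom and of a common epsilon-clamping workaround (the function name `listnet_top1_loss` and the `eps` value are my own choices, not from this repo):

```python
import numpy as np

# numpy emits exactly these RuntimeWarnings for log of 0 or NaN;
# errstate silences them here just so the sketch runs quietly.
with np.errstate(divide="ignore", invalid="ignore"):
    print(np.log(0.0))     # -inf -> "divide by zero encountered in log"
    print(np.log(np.nan))  # nan  -> "invalid value encountered in log"

# Hypothetical workaround: clamp predicted probabilities away from 0
# before taking the log in a ListNet-style top-1 cross-entropy.
def listnet_top1_loss(p_true, p_pred, eps=1e-10):
    p_pred = np.clip(p_pred, eps, 1.0)
    return -np.sum(p_true * np.log(p_pred))
```

With the clamp, a prediction that assigns probability 0 to a relevant item yields a large but finite loss instead of inf/NaN.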

Thank you!