premgopalan / collabtm

Collaborative Topic Modeling. P. Gopalan, L. Charlin, D.M. Blei, Content-based recommendations with Poisson factorization, NIPS 2014.
GNU General Public License v3.0

Training likelihood decreasing with iterations #3

Open david-cortes opened 6 years ago

david-cortes commented 6 years ago

If I pass a validation file that is a copy of the training file and look at the numbers in validation.txt, I see that at some point the log-likelihood starts decreasing (moving away from zero) rather than increasing.

I’m not an expert in variational inference, but if these are full-batch coordinate-ascent updates, where each parameter is set to its expected value given the other variables, shouldn’t the training likelihood increase monotonically with the number of iterations?
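For reference, this is the property I have in mind. The sketch below is a toy example, not the collabtm Poisson model: mean-field CAVI on a 2-D Gaussian, where each coordinate update is the exact conditional optimum, so the ELBO (which is what CAVI actually guarantees is monotone; the quantity in validation.txt may be a different likelihood estimate) can only go up.

```python
import numpy as np

# Toy target (NOT the collabtm model): 2-D Gaussian with correlated components.
mu = np.array([1.0, -1.0])
Lam = np.array([[2.0, 0.9],   # precision matrix, positive definite
                [0.9, 1.5]])

# Mean-field q(x1)q(x2): for a Gaussian target, the optimal CAVI factor for
# coordinate i is Gaussian with precision Lam[i, i], so only the means move.
s2 = 1.0 / np.diag(Lam)        # fixed factor variances
m = np.array([5.0, -5.0])      # deliberately bad initialization

def elbo(m):
    """Exact ELBO = E_q[log p(x)] + H[q] for this toy model."""
    d = m - mu
    quad = (Lam[0, 0] * (s2[0] + d[0] ** 2)
            + Lam[1, 1] * (s2[1] + d[1] ** 2)
            + 2.0 * Lam[0, 1] * d[0] * d[1])
    log_p = 0.5 * np.linalg.slogdet(Lam)[1] - np.log(2 * np.pi) - 0.5 * quad
    entropy = 0.5 * np.sum(np.log(2 * np.pi * np.e * s2))
    return log_p + entropy

history = [elbo(m)]
for _ in range(20):
    # Coordinate ascent: each mean is set to its conditional optimum
    # given the current value of the other factor's mean.
    m[0] = mu[0] - Lam[0, 1] / Lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - Lam[1, 0] / Lam[1, 1] * (m[0] - mu[0])
    history.append(elbo(m))

# CAVI guarantees the ELBO never decreases across updates.
assert np.all(np.diff(history) >= -1e-12)
print("ELBO is monotone; final means:", np.round(m, 6))
```

So if validation.txt reports (an approximation to) the training log-likelihood rather than the ELBO, small decreases might be explainable; if it reports the ELBO itself, a decrease would suggest a bug in the updates.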

The dataset is under this link: https://drive.google.com/open?id=1FzBzQnGU3bQ3ojLIGy9Hby9A6Tcun-JQ

Called with the following parameters: collabtm -dir path_to_data -nusers 191770 -ndocs 119448 -nvocab 342260 -k 100

The log-likelihood shows a decrease at iterations 60 and 70, after which training stops.

0   121 -14.392807046   434084
10  1331    -13.920642836   434084
20  2543    -12.258906021   434084
30  3767    -12.187407095   434084
40  4989    -12.173852715   434084
50  6210    -12.170551069   434084
60  7458    -12.172230009   434084
70  8680    -12.180070428   434084