If I pass a validation file that is a copy of the training file and look at the numbers in validation.txt, I see that at some point the log-likelihood starts decreasing (moving away from zero) rather than increasing.
I'm not an expert in variational inference, but if the inference is doing full-batch coordinate updates in which each parameter is set to its expected value given the other variables, shouldn't the training log-likelihood increase monotonically with the number of iterations?
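For reference, here is a minimal toy sketch of the kind of coordinate-ascent behaviour I have in mind (plain Python, not collabtm's actual model or code): mean-field CAVI on a 2D Gaussian target, where each factor's mean is set to its expectation given the other factor, and the ELBO, which lower-bounds the log-likelihood, is guaranteed never to decrease.

```python
# Toy sketch, not collabtm: coordinate-ascent mean-field VI (CAVI) on a
# 2D Gaussian target. Each sweep sets one factor's mean to its expected
# value given the other factor; the ELBO is non-decreasing by construction.
import numpy as np

# Target: p(z) = N(m, Lambda^{-1}) with a correlated precision matrix.
m = np.array([1.0, -2.0])
Lam = np.array([[2.0, 0.8],
                [0.8, 1.5]])

# Mean-field family: q(z) = N(mu1, s2[0]) * N(mu2, s2[1]).
mu = np.zeros(2)
s2 = 1.0 / np.diag(Lam)  # CAVI fixes each factor's variance at 1/Lambda_ii

def elbo(mu, s2):
    """ELBO = E_q[log p(z)] + H[q], closed form for Gaussian p and q."""
    d = mu - m
    quad = (Lam[0, 0] * (s2[0] + d[0] ** 2)
            + Lam[1, 1] * (s2[1] + d[1] ** 2)
            + 2.0 * Lam[0, 1] * d[0] * d[1])
    e_log_p = -np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(Lam)) - 0.5 * quad
    entropy = 0.5 * (np.log(2 * np.pi * np.e * s2[0])
                     + np.log(2 * np.pi * np.e * s2[1]))
    return e_log_p + entropy

prev = -np.inf
for it in range(20):
    # Each factor's mean is set to its expectation given the other factor.
    mu[0] = m[0] - (Lam[0, 1] / Lam[0, 0]) * (mu[1] - m[1])
    mu[1] = m[1] - (Lam[0, 1] / Lam[1, 1]) * (mu[0] - m[0])
    cur = elbo(mu, s2)
    assert cur >= prev - 1e-12, "CAVI should never decrease the ELBO"
    print(f"iter {it:2d}  ELBO = {cur:.6f}")
    prev = cur
```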
The dataset is available at this link: https://drive.google.com/open?id=1FzBzQnGU3bQ3ojLIGy9Hby9A6Tcun-JQ
I ran it with the following parameters:
collabtm -dir path_to_data -nusers 191770 -ndocs 119448 -nvocab 342260 -k 100
The log-likelihood decreases at iterations 60 and 70, after which the run stops.
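In case the exact numbers help, a snippet like the following is enough to spot the decreases in validation.txt (the column index is an assumption on my part; I'm treating the log-likelihood as the third whitespace-separated column, so adjust it if the file is laid out differently):

```python
# Flag every iteration at which the validation log-likelihood dropped.
# Assumption: the log-likelihood is the third whitespace-separated column
# of validation.txt; change the index below if the file differs.
lls = []
with open("validation.txt") as f:
    for line in f:
        fields = line.split()
        if len(fields) >= 3:
            lls.append(float(fields[2]))

drops = [i for i in range(1, len(lls)) if lls[i] < lls[i - 1]]
print("number of log-likelihood values read:", len(lls))
print("positions where the value decreased:", drops)
```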