dswah / sgcrfpy

SGCRFpy: Sparse Gaussian Conditional Random Fields in Python
MIT License

loss increases sometimes #22

Open dswah opened 8 years ago

dswah commented 8 years ago

on some real problems, we see that the negative log likelihood (with the l1 penalty) increases. the increase happens as a result of the update to Theta.


here is a plot of the change in loss after each update to Theta. it looks like the loss sometimes increases!

but the Lambda updates are protected by the backtracking line search, so they never increase the loss.
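For reference, a minimal sketch of the kind of backtracking guard that protects the Lambda updates. This is a generic line search, not sgcrfpy's actual implementation; the function and parameter names here are illustrative. The same guard could wrap the Theta update as a diagnostic while the bug is tracked down:

```python
def backtracking(loss, x, step, beta=0.5, max_iter=20):
    """Shrink a proposed step until the objective decreases.

    `loss` is a callable returning the penalized objective at a point,
    `x` is the current iterate, and `step` is the proposed update
    direction. Hypothetical names, not sgcrfpy's API.
    """
    f0 = loss(x)
    t = 1.0
    for _ in range(max_iter):
        if loss(x + t * step) < f0:
            return x + t * step  # accept the first step that improves
        t *= beta                # otherwise shrink and retry
    return x  # reject the update entirely: the loss can never go up
```

Because a rejected update returns `x` unchanged, an update wrapped this way cannot increase the monitored loss, which is exactly the behavior the Lambda plot shows.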

dswah commented 8 years ago

the loss should NEVER increase from a Theta update: the sub-problem is convex and we have a closed-form solution for the update, so the update lands at the sub-problem's minimum.
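A quick numerical sanity check of that argument: the exact minimizer of a convex quadratic can never increase the objective, so if the monitored loss rises after a closed-form Theta update, the update is not minimizing the objective actually being monitored (e.g. the l1 term is missing from the sub-problem, or a cached intermediate quantity is stale). The matrices below are random stand-ins, not sgcrfpy internals:

```python
import numpy as np

# Convex quadratic: f(theta) = 0.5 * theta^T A theta - b^T theta,
# with A positive definite so the minimizer is theta* = A^{-1} b.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # positive definite by construction
b = rng.standard_normal(5)
f = lambda th: 0.5 * th @ A @ th - b @ th

theta0 = rng.standard_normal(5)    # arbitrary current iterate
theta_star = np.linalg.solve(A, b) # closed-form minimizer

# The closed-form update can only decrease (or tie) the loss.
assert f(theta_star) <= f(theta0)
```

If the real Theta update fails an analogous check, the discrepancy is between the sub-problem being solved and the loss being plotted, which narrows the search for the bug considerably.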