Closed sandyhsia closed 6 years ago
With regards to the variable name, i.e. `error` being misleading: you are right. I do not have the bandwidth right now to fix it, but if you submit a pull request with the appropriate changes, I would be happy to review and approve it. Just make sure to branch off `development`.
With regards to sigma2: yes, one could probably use that as an estimate of the GMM's uncertainty for drawing target points. I do not quite remember the details, but when I was a graduate student I had trouble making a variance-based stopping criterion work, and simply comparing q to q-prev seemed to give better results.
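For comparison, here is a minimal sketch of the two candidate stopping rules discussed here. The helper names and the toy traces are hypothetical; this is not pycpd's implementation:

```python
import numpy as np

def q_diff_stop(q, q_prev, tolerance=1e-8):
    """Stop when the objective Q barely changes between EM iterations
    (the 'comparing q to q-prev' rule that worked better in practice)."""
    return np.abs(q - q_prev) < tolerance

def sigma2_stop(sigma2, sigma2_prev, tolerance=1e-8):
    """Variance-based alternative: stop when the GMM variance sigma2
    plateaus. Per the comment above, this was harder to tune."""
    return np.abs(sigma2 - sigma2_prev) < tolerance

# Toy traces: both quantities shrink as EM converges.
q_trace = [5.0, 2.0, 1.0, 1.0 + 1e-10]
sigma2_trace = [1.0, 0.3, 0.1, 0.1 + 1e-10]
print(q_diff_stop(q_trace[3], q_trace[2]))            # True: Q has plateaued
print(sigma2_stop(sigma2_trace[1], sigma2_trace[0]))  # False: still shrinking
```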
Hi, in the bunny example I found that the "Error" shown in the visualization is misleading, because it keeps increasing the whole time, and at the last iteration the error suddenly becomes zero. For example, at iteration 20 and iteration 39 (screenshots omitted).
When I checked the source code and the paper, it sounds like error means the difference between Q (the objective function) at the current iteration and at the previous iteration, as computed in rigid_registration.py. Given the np.abs(), we can't even tell whether Q is actually increasing or decreasing. Q should converge through the EM algorithm, so error (actually the difference) < tolerance might be a reasonable metric for convergence, but I hope the name of the variable could be clearer.

Second question: while optimizing Q, is sigma2 also decreasing, and could it therefore be used as a metric to check convergence?
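The check described above can be sketched like this. This is a paraphrase for illustration, not the actual pycpd source; the names `q`, `qprev`, and `tolerance` just mirror the terms used in this thread:

```python
import numpy as np

def has_converged(q, qprev, tolerance=1e-5):
    """Convergence check as described above: compare the objective Q
    between consecutive EM iterations.

    Because of np.abs(), the reported "error" only measures how much Q
    changed, not whether it went up or down; the signed difference
    would reveal the direction."""
    error = np.abs(q - qprev)   # magnitude of the change in Q
    signed_diff = q - qprev     # negative when Q decreased
    return error < tolerance, signed_diff

# Example: Q still changing noticeably between iterations.
converged, diff = has_converged(q=10.0, qprev=10.5)
# converged is False (|10.0 - 10.5| = 0.5), diff is -0.5 (Q decreased)
```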