EasyLiu-Ly / jahmm

Automatically exported from code.google.com/p/jahmm
BSD 3-Clause "New" or "Revised" License

Compute likelihood during iteration / allow convergence to an absolute or relative tolerance. #30


GoogleCodeExporter commented 8 years ago
What steps will reproduce the problem?

It seems there is no way to hook into the Baum-Welch learner's iteration to
compute the likelihood of the current parameters as the algorithm runs. This is
aggravated by the fact that there is no straightforward way to add the
likelihood computation in a subclass, because the relevant quantities are
buried deep inside the iterate method.

What is the expected output? What do you see instead?

Most learning algorithms expose the likelihood at each iteration. This makes it
possible to check local convergence properly during EM, and also to verify
correctness, since the likelihood should increase monotonically. The
Kullback-Leibler distance measure currently used is only a crude approximation
to this.
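To illustrate, a convergence test of the kind requested (stopping on an absolute or relative tolerance on the log-likelihood, plus a monotonicity check) could look like the sketch below. The helper name, tolerances, and the stand-in log-likelihood values are all hypothetical, not part of jahmm's API:

```java
public class LikelihoodConvergence {

    // Hypothetical helper: stop when the log-likelihood gain falls below an
    // absolute tolerance, or below a tolerance relative to the previous value.
    static boolean converged(double prevLnL, double lnL, double absTol, double relTol) {
        double gain = lnL - prevLnL;
        return gain <= absTol || gain <= relTol * Math.abs(prevLnL);
    }

    public static void main(String[] args) {
        // Stand-in log-likelihoods such as EM would produce, one per iteration.
        double[] lnL = {-120.0, -95.0, -90.5, -90.4999};
        int stoppedAt = -1;
        for (int i = 1; i < lnL.length; i++) {
            // Correctness check: EM must never decrease the likelihood.
            if (lnL[i] < lnL[i - 1])
                throw new IllegalStateException("likelihood decreased at iteration " + i);
            if (converged(lnL[i - 1], lnL[i], 1e-3, 1e-6)) {
                stoppedAt = i;
                break;
            }
        }
        System.out.println("converged at iteration " + stoppedAt);
    }
}
```

With access to the likelihood at each step, the loop above is all a caller would need to stop the learner at a chosen tolerance instead of running a fixed number of iterations.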

What version of the product are you using? On what operating system?

jahmm 0.6.2 on Ubuntu 12.04

Please provide any additional information below.

I wish you guys would host this project on GitHub instead of Google Code. Then
it would be easier for people like me to fork the project, implement things
like this, and send a pull request with the updates.

Original issue reported on code.google.com by miz...@gmail.com on 16 Sep 2013 at 2:16