jmschrei / yahmm

Yet Another Hidden Markov Model repository.
MIT License

Tied Edges #26

Closed jmschrei closed 10 years ago

jmschrei commented 10 years ago

When the edges leaving one state should end up with the same respective probabilities as the edges leaving another state after training, the edges are 'tied'. This is useful when training a repeated structure like a global sequence alignment HMM, where you want the transitions in each repeated subunit to match the corresponding transitions in the others, because the probability of deleting or inserting should be the same at every position.
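To make the repeated structure concrete, here is a minimal sketch of two "positions" of a global alignment HMM built with yahmm (the state names `M1`/`I1`/`D1` etc. are illustrative, not from this thread). Without tying, Baum-Welch would re-estimate the match-to-insert and match-to-delete probabilities at each position independently, even though they should be equal.

```python
from yahmm import Model, State, DiscreteDistribution

model = Model(name="global-alignment-fragment")
background = {'A': 0.25, 'C': 0.25, 'G': 0.25, 'T': 0.25}

# Two repeated positions: match, insert, and (silent) delete states.
m1 = State(DiscreteDistribution(background), name="M1")
i1 = State(DiscreteDistribution(background), name="I1")
d1 = State(None, name="D1")  # silent delete state
m2 = State(DiscreteDistribution(background), name="M2")
i2 = State(DiscreteDistribution(background), name="I2")
d2 = State(None, name="D2")

for s in [m1, i1, d1, m2, i2, d2]:
    model.add_state(s)

model.add_transition(model.start, m1, 1.0)

# Both positions share the same topology; ideally the corresponding
# transition probabilities would stay equal after training.
for m, i, d, nxt in [(m1, i1, d1, m2), (m2, i2, d2, model.end)]:
    model.add_transition(m, nxt, 0.90)
    model.add_transition(m, i, 0.05)
    model.add_transition(m, d, 0.05)
    model.add_transition(i, i, 0.30)
    model.add_transition(i, nxt, 0.70)
    model.add_transition(d, nxt, 1.00)

model.bake()
```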

I am unsure how we want to specify that edges are tied together. Distributions were easy to tie, since multiple states could simply share the same underlying distribution object.

jmschrei commented 10 years ago

I've been thinking about it, and what we might do is add a method called Model.add_tied_transitions, where you pass in a list of lists of tuples. Each tuple represents an edge in the format (from, to, probability), each inner list represents all the transitions out of a given from state, and the outer list groups together the states whose edges you wish to tie. In Baum-Welch we could then sum the expected transition counts across all tied edges, just like we do with tied distributions. Thoughts?
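A hypothetical sketch of what that call might look like, reusing the states from the fragment above; the method name and argument layout follow the description in this comment and may differ from whatever was eventually implemented:

```python
# Each tuple is (from_state, to_state, probability); each inner list holds the
# edges leaving one state; the outer list groups states whose edges are tied.
model.add_tied_transitions([
    [(m1, m2, 0.90), (m1, i1, 0.05), (m1, d1, 0.05)],         # edges out of M1
    [(m2, model.end, 0.90), (m2, i2, 0.05), (m2, d2, 0.05)],  # edges out of M2
])
```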

jmschrei commented 10 years ago

I think a better way of handling this might be to add a parameter named group to the add_transition method, so that all edges with the same group get trained together. This would simply mean that the expected transition counts are calculated on a per-group basis rather than a per-edge basis.
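A sketch of the group-based approach, again reusing the states from the first fragment and assuming a `group` keyword on add_transition as described here (the group labels are illustrative):

```python
# Edges sharing a group label have their expected transition counts pooled
# during Baum-Welch, so they receive the same re-estimated probability.
model.add_transition(m1, m2, 0.90, group="match-advance")
model.add_transition(m2, model.end, 0.90, group="match-advance")

model.add_transition(m1, i1, 0.05, group="match-insert")
model.add_transition(m2, i2, 0.05, group="match-insert")

model.add_transition(m1, d1, 0.05, group="match-delete")
model.add_transition(m2, d2, 0.05, group="match-delete")
```

This keeps the graph-building API a single method and avoids the nested-list bookkeeping of the earlier proposal.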

jmschrei commented 10 years ago

Added.