Closed: jmgo closed this issue 7 years ago

Original question (jmgo):

Hi!
As I mentioned in another issue, I have implemented a simple custom observation distribution, with the code:

My question is: can the Viterbi algorithm handle log_likelihoods = -np.inf? This happens when a probability is 0 (log(0) = -inf).
Reply:

Yes, it does (though if the log probabilities of a particular observation are -inf for every state, then the behavior is inherited from Eigen's maxCoeff()).
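As a rough NumPy analogue of that corner case (just an illustration of the degenerate max, not pyhsmm's actual Eigen code):

```python
import numpy as np

# All states impossible at one step: every score is -inf.
scores = np.array([-np.inf, -np.inf])
print(scores.max())     # -inf
print(scores.argmax())  # 0 -- NumPy happens to return the first index;
                        #      Eigen's maxCoeff() tie-breaking may differ
```

In other words, when every state is impossible the "best" score is still -inf and the chosen index is just a tie-break, so the decoded state at that step is effectively arbitrary.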
Here's a test:
from __future__ import division
import numpy as np
from pyhsmm import models as m, distributions as d
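# Two deterministic observation distributions (the rows of np.eye(2)):
# state 0 always emits symbol 0, state 1 always emits symbol 1.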
hmm = m.HMM(
    obs_distns=[d.Categorical(weights=row) for row in np.eye(2)],
    trans_matrix=np.ones((2,2))/2.,
    pi_0=np.array([1.,0.]))
s = hmm.add_data(np.tile([0,1], 10))
s.Viterbi()
print s.stateseq
print s.aBl # aBl[t,i] is log likelihood of observation t under state i
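In this setup each Categorical puts probability 1 on one symbol and 0 on the other, so every row of aBl should contain one 0 and one -inf, and the decoded stateseq should come out alternating 0,1,0,1,.... To see why -inf entries are harmless as long as at least one state has finite likelihood at every step, here is a small standalone log-space Viterbi in NumPy (an illustrative sketch of the general recursion, not pyhsmm's implementation; the function name and setup are made up for this example): a -inf score simply loses every comparison inside the max, so impossible states are never selected.

```python
import numpy as np

def viterbi_log(log_pi, log_A, log_B):
    """Standalone log-space Viterbi (illustrative sketch, not pyhsmm's code).

    log_pi : (K,)   initial state log probabilities
    log_A  : (K, K) transition log probabilities, log_A[i, j] = log p(j | i)
    log_B  : (T, K) observation log likelihoods per step (may contain -inf)
    """
    T, K = log_B.shape
    score = log_pi + log_B[0]              # best log score ending in each state at t=0
    back = np.zeros((T, K), dtype=int)     # backpointers
    for t in range(1, T):
        # cand[i, j]: score of being in state i at t-1 and moving to state j at t
        cand = score[:, None] + log_A
        back[t] = cand.argmax(axis=0)      # -inf entries lose every comparison here
        score = cand.max(axis=0) + log_B[t]
    path = np.empty(T, dtype=int)
    path[-1] = score.argmax()
    for t in range(T - 1, 0, -1):          # backtrack the best path
        path[t - 1] = back[t, path[t]]
    return path

# Same setup as the test above: deterministic emissions, uniform transitions.
with np.errstate(divide='ignore'):                  # np.log(0) -> -inf, warning silenced
    log_B = np.log(np.tile(np.eye(2), (10, 1)))     # rows alternate [0, -inf], [-inf, 0]
    log_A = np.log(np.full((2, 2), 0.5))
    log_pi = np.log(np.array([1.0, 0.0]))           # -inf for state 1 at t=0

print(viterbi_log(log_pi, log_A, log_B))            # expected: [0 1 0 1 ...]
```

Only when all K scores at a step are -inf does tie-breaking (the maxCoeff() situation above) come into play.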