mattjj / pyhsmm


Custom observational distribution #69

Closed · jmgo closed this issue 7 years ago

jmgo commented 7 years ago

Hi!

As I mentioned in another issue, I have implemented a simple custom observational distribution with the following code:

import numpy as np

class ClassifierDist(object):

    def __init__(self, clf, state):
        # clf: a fitted classifier exposing predict_proba (e.g. sklearn-style)
        # state: the hidden-state index this distribution corresponds to
        self.clf = clf
        self.state = state

    def log_likelihood(self, x):
        # probability assigned to this state for each observation in x
        prob = self.clf.predict_proba(x)[:, self.state]
        return np.log(prob)

My question is: can the Viterbi algorithm handle log-likelihoods equal to -np.inf? This happens when prob is 0, since log(0) = -inf.
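
For context, here is a minimal sketch of how ClassifierDist might be wired into a pyhsmm HMM; the classifier, training data, and state count below are illustrative assumptions, not from my actual setup. (If the -inf values turn out to be a problem, the probabilities could also be floored with np.clip before taking the log.)

import numpy as np
from sklearn.linear_model import LogisticRegression

from pyhsmm import models as m

# Hypothetical training data: 3-d features with per-frame labels in {0, 1}.
X = np.random.randn(200, 3)
y = (X[:, 0] > 0).astype(int)

clf = LogisticRegression().fit(X, y)  # any classifier with predict_proba

num_states = 2
hmm = m.HMM(
    # ClassifierDist as defined above, one instance per hidden state
    obs_distns=[ClassifierDist(clf, s) for s in range(num_states)],
    trans_matrix=np.ones((num_states, num_states)) / num_states,
    pi_0=np.ones(num_states) / num_states)

s = hmm.add_data(X)  # Viterbi only needs log_likelihood on the obs_distns
s.Viterbi()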

mattjj commented 7 years ago

Yes, it can (though if the log probabilities of a particular observation are -inf for every state, the result inherits the behavior of Eigen's maxCoeff()).

Here's a test:

from __future__ import division, print_function
import numpy as np

from pyhsmm import models as m, distributions as d

hmm = m.HMM(
    obs_distns=[d.Categorical(weights=row) for row in np.eye(2)],
    trans_matrix=np.ones((2,2))/2.,
    pi_0=np.array([1.,0.]))

s = hmm.add_data(np.tile([0,1], 10))
s.Viterbi()
print(s.stateseq)
print(s.aBl)  # aBl[t,i] is the log likelihood of observation t under state i
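
With these deterministic emissions, each row of aBl should contain one entry equal to log(1) = 0 and one equal to log(0) = -inf, and the Viterbi path should track the observations exactly ([0, 1, 0, 1, ...]), i.e. the -inf likelihoods don't break the decoding.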