mattjj / pyhsmm

MIT License

Include resampling over (kappa+alpha) and rho=kappa/(kappa+alpha) in WeakLimitStickyHDPHMM #98

Open NickHoernle opened 5 years ago


This change follows Algorithm 9 and Appendix C of E. Fox's 2009 dissertation.

The change reparameterizes alpha and kappa as (alpha+kappa) and rho = kappa/(alpha+kappa). We place a Gamma prior on (alpha+kappa) and a Beta prior on rho. The WeakLimitStickyHDPHMM class is updated to accept the new hyperparameters. Further, new "FullConcGibbs" classes are added to the transitions.py module to indicate that we now resample all of the concentration hyperparameters in the model (rather than just the alpha and gamma parameters, as before).
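To make the reparameterization concrete, here is a minimal sketch of drawing (alpha+kappa) and rho from their priors and recovering alpha and kappa. The function name and signature are hypothetical, for illustration only; they are not the PR's actual implementation:

```python
import numpy as np

def sample_hypers(a_0, b_0, c_0, d_0, rng=None):
    """Draw (alpha + kappa) ~ Gamma(a_0, b_0) and rho ~ Beta(c_0, d_0),
    then invert the reparameterization rho = kappa / (alpha + kappa)."""
    rng = rng if rng is not None else np.random.default_rng()
    # NumPy's gamma is parameterized by shape and scale, so scale = 1 / rate
    alpha_plus_kappa = rng.gamma(shape=a_0, scale=1.0 / b_0)
    rho = rng.beta(c_0, d_0)
    kappa = rho * alpha_plus_kappa          # sticky self-transition bias
    alpha = (1.0 - rho) * alpha_plus_kappa  # remaining concentration mass
    return alpha, kappa
```

Conditioned on the state sequence, the posterior updates follow the auxiliary-variable scheme from Appendix C of the dissertation; the sketch above only shows the prior draw and the change of variables.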

I ran the updated model on example-data.txt following the code in hsmm.py, changing the relevant model construction to:

posteriormodel = pyhsmm.models.WeakLimitStickyHDPHMM(
                    gamma_a_0=1,
                    gamma_b_0=1/4,
                    alpha_kappa_a_0=1,
                    alpha_kappa_b_0=1/4,
                    rho_c_0=1,
                    rho_d_0=1,
                    init_state_concentration=1,
                    obs_distns=obs_distns)

The result is that we now have a posterior over kappa, which can be visualized with plt.hist([m.trans_distn.kappa for m in models]) (histogram attached as an image).
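For context, a `models` list like the one above is typically collected by snapshotting the model after each Gibbs sweep. The sketch below shows only that collection pattern; `ToyModel` is a stand-in, not pyhsmm's API (a real run would call `posteriormodel.resample_model()` on the model constructed above):

```python
import copy

class ToyModel:
    """Stand-in for a pyhsmm model: resample_model() performs one Gibbs
    sweep and mutates state in place (here kappa is just a counter)."""
    def __init__(self):
        self.kappa = 0.0
    def resample_model(self):
        self.kappa += 1.0  # placeholder for a real Gibbs update

posteriormodel = ToyModel()
models = []
for itr in range(200):
    posteriormodel.resample_model()   # one full Gibbs sweep
    if itr >= 100:                    # discard the first 100 sweeps as burn-in
        models.append(copy.deepcopy(posteriormodel))  # snapshot this sample

kappas = [m.kappa for m in models]    # posterior samples of kappa
```

The `copy.deepcopy` is what makes each list entry an independent posterior sample rather than 100 references to the same mutating object.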

I'd very much appreciate any comments, suggestions, or reviews.

Thanks very much