This change follows Algorithm 9 and Appendix C of E. Fox's 2009 dissertation.
The change reparameterizes `alpha` and `kappa` as `(alpha + kappa)` and `rho = kappa / (alpha + kappa)`. We place a Gamma prior over `(alpha + kappa)` and a Beta prior over `rho`. The `WeakLimitStickyHDPHMM` class is updated to accept the new hyperparameters. Further, new "FullConcGibbs" classes are added to `transitions.py` to indicate that we now sample over all of the concentration hyperparameters in the model (rather than just `alpha` and `gamma`, as before).
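To illustrate the reparameterization, here is a minimal sketch (not code from this PR; the function names are mine): the mapping back from `(alpha + kappa, rho)` to `(alpha, kappa)`, and the conjugate Beta update for `rho`, assuming the auxiliary Bernoulli "override" indicators `w` from Fox's augmentation scheme have already been sampled.

```python
import numpy as np

def split_concentration(alpha_plus_kappa, rho):
    """Recover (alpha, kappa) from the reparameterization
    alpha + kappa and rho = kappa / (alpha + kappa)."""
    kappa = rho * alpha_plus_kappa
    alpha = (1.0 - rho) * alpha_plus_kappa
    return alpha, kappa

def resample_rho(w, c=1.0, d=1.0, rng=None):
    """Beta-Bernoulli conjugate update for rho.

    w is a 0/1 array of auxiliary 'override' indicators (one per
    transition, per Fox's augmentation); c, d are the Beta prior
    hyperparameters. Returns a draw from the Beta posterior."""
    if rng is None:
        rng = np.random.default_rng()
    w = np.asarray(w)
    return rng.beta(c + w.sum(), d + w.size - w.sum())

alpha, kappa = split_concentration(10.0, 0.3)
# alpha ~ 7.0, kappa ~ 3.0; their sum recovers alpha + kappa = 10.0
```

Since `rho` has a Beta prior and the `w` indicators are Bernoulli given `rho`, the posterior stays Beta, which is what makes this slice of the Gibbs sweep a one-line draw.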
I ran the updated model on `example-data.txt`, following the code in `hsmm.py` (with the relevant model changed to the updated `WeakLimitStickyHDPHMM`).
The result is that we now have a posterior over `kappa`:

![image](https://user-images.githubusercontent.com/3874391/58944328-45ee6a00-878a-11e9-9624-eb7644e514bd.png)
The histogram was produced with:

```python
plt.hist([m.trans_distn.kappa for m in models])
```

I'd very much appreciate any comments/suggestions/reviews.
Thanks very much!