lindermanlab / ssm

Bayesian learning and inference for state space models
MIT License

Is it possible to fit states to more than one series at a time? #37

Closed: ewerlopes closed this issue 5 years ago

ewerlopes commented 5 years ago

Hi,

Thanks for sharing this library. I was wondering whether we can fit the states on a collection of time series instead of just one, as in the examples.

Thanks.

bagibence commented 5 years ago

Yes, I've been using the SLDS on a list of recordings. You just have to pass a list of time series to the functions instead of a single one. Each list element should have shape T x D, where T is the number of time points (the different time series can have different lengths) and D is the dimensionality of your observations.
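For concreteness, here is a minimal sketch of fitting one model on several recordings (the model sizes are made up for illustration; see the package's example notebooks for full usage):

```python
import numpy as np
import ssm

K, D = 5, 10   # number of discrete states, observation dimension

# One list entry per recording; each is a (T_i, D) array, and the T_i may differ.
datas = [np.random.randn(T, D) for T in (100, 250, 80)]

hmm = ssm.HMM(K, D, observations="gaussian")
lls = hmm.fit(datas, method="em")   # a single model fit across all recordings
```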

slinderman commented 5 years ago

Thanks @bagibence, that's exactly right.

ewerlopes commented 5 years ago

@bagibence That is good news! I am going to try it right now! Thanks for the prompt reply!

ewerlopes commented 5 years ago

@slinderman, just one question: the fact that the ELBO does not increase monotonically comes from using a stochastic version of mean field variational inference, right?

slinderman commented 5 years ago

The ELBO should increase monotonically for HMMs fit with EM; we've implemented exact M-steps for most observation models. For the SLDS, the examples currently use black box variational inference (BBVI) with SGD, Adam, RMSProp, etc. We've implemented a few variational families, including a mean field posterior

q(x) = \prod_t N(x_t | mu_t, Sigma_t)

and a structured variational posterior

q(x) = N(x | mu, Sigma)

where mu is a (TD,) vector and Sigma is a (TD x TD) matrix whose inverse is block-tridiagonal. That is, the variational posterior corresponds to a chain-structured graph, as in an LDS.
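To make the two families concrete, here is a small illustrative sketch in plain NumPy (not the package's internals, and the numbers are arbitrary) of the parameters each family stores:

```python
import numpy as np

T, D = 5, 2  # time steps, continuous latent dimension (illustrative)

# Mean field: an independent Gaussian per time step.
mf_mus = np.zeros((T, D))                  # mu_t for each t
mf_Sigmas = np.tile(np.eye(D), (T, 1, 1))  # Sigma_t for each t

# Structured: one joint Gaussian over the stacked (TD,) vector, parameterized
# through a block-tridiagonal precision J = inv(Sigma) that encodes the chain
# x_1 - x_2 - ... - x_T, as in an LDS.
J = np.zeros((T * D, T * D))
for t in range(T):
    J[t*D:(t+1)*D, t*D:(t+1)*D] = 2.0 * np.eye(D)   # diagonal blocks
    if t + 1 < T:
        off = -0.5 * np.eye(D)                      # couples x_t and x_{t+1}
        J[t*D:(t+1)*D, (t+1)*D:(t+2)*D] = off
        J[(t+1)*D:(t+2)*D, t*D:(t+1)*D] = off.T

Sigma = np.linalg.inv(J)  # Sigma itself is dense; only its inverse is sparse
```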

In a separate branch, David Z. and I are working on a Laplace variational inference method that maintains chain-structured posteriors on both q(x) and q(z), and is much more efficient than BBVI.
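(For anyone landing here later: assuming that work ships as a laplace_em fitting method in the main package, usage would look roughly like the sketch below; the argument names may differ between versions.)

```python
import numpy as np
import ssm

N, K, D = 10, 5, 2   # illustrative: observation dim, discrete states, latent dim
datas = [np.random.randn(T, N) for T in (100, 250, 80)]

slds = ssm.SLDS(N, K, D, emissions="gaussian")
elbos, posterior = slds.fit(datas, method="laplace_em",
                            variational_posterior="structured_meanfield")
```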