tensorflow / probability

Probabilistic reasoning and statistical analysis in TensorFlow
https://www.tensorflow.org/probability/
Apache License 2.0

How to constrain initial_weights_prior being positive in tfp.DynamicLinearRegression #1096

Open yuming-p opened 3 years ago

yuming-p commented 3 years ago

Hi all,

I am building a structural time series model for causal analysis. I would like to constrain the weights in DynamicLinearRegression (estimated by variational inference) to be non-negative, for interpretability reasons.

The argument `initial_weights_prior` in DynamicLinearRegression is required to be an instance of tfd.MultivariateNormal. I wonder if it's possible to truncate tfd.MultivariateNormalDiag to be non-negative, or to create a multivariate half-normal distribution in a format that is a valid input for `initial_weights_prior`. It would be great if someone could enlighten me on achieving non-negative dynamic coefficients in this case.

jeffpollock9 commented 3 years ago

Hello,

The (dynamic) weights in DynamicLinearRegression must be Gaussian so a LinearGaussianStateSpaceModel can be formed. The weights in this model are estimated using the Kalman filter.

If you don't need dynamic weights, you could look at using LinearRegression instead. Its weights are static, are not restricted to be Gaussian, and can be estimated with variational inference (or HMC etc.).

If you do need dynamic and non-Gaussian weights, I think you could look into the (currently experimental) particle filtering stuff:

https://www.tensorflow.org/probability/api_docs/python/tfp/experimental/mcmc/particle_filter

Although I have unfortunately never tried it.

Hopefully someone else can pipe in with some more useful information - thought I'd just share the little bits that I (think I) know.