parsing-science / pymc3_models

Apache License 2.0

Injecting priors #2

Closed rlouf closed 2 years ago

rlouf commented 6 years ago

One of the strengths of Bayesian analysis is that you can adapt the choice of priors to the study at hand, or fall back to non-informative priors when you have no a priori knowledge.

I think it would make sense to separate the prior specification from the model itself. I am not very familiar with the internals of PyMC3, but perhaps one could define the prior as a model and pass it in as a variable. This would allow flexibility in the choice of priors without users having to specify the whole model. Of course, one can keep sensible defaults (probably non-informative priors) for people who do not have a preference, or don't want to bother.

This goes even further than just specifying the parameters of a prior distribution (as in "I know the slope of my linear regression should be around 3.2, plus or minus 1.5"), which is why I suggest model injection. For instance, in Gaussian Mixture Models you could fix K, the number of components. Or you may want to go nonparametric and say that K is distributed according to a Poisson with parameter \lambda, with whatever prior on \lambda. Which you choose depends on your modeling situation, but from the library owner's perspective you don't want to have to write two different models for these cases because, in essence, each is just a GMM.
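To make the suggestion concrete, here is a minimal sketch of the injection pattern in plain Python. All names here (`LinearRegression`, `default_priors`, the tuple encoding of distributions) are hypothetical illustrations, not the pymc3_models API; in the real library the `build` step would construct a `pm.Model()` with a likelihood from the injected specification.

```python
def default_priors():
    # Sensible non-informative defaults for users with no preference.
    # Each prior is encoded as (distribution name, parameters) for the sketch.
    return {
        "slope": ("Normal", {"mu": 0.0, "sd": 100.0}),
        "intercept": ("Normal", {"mu": 0.0, "sd": 100.0}),
    }

class LinearRegression:
    """Hypothetical model that accepts injected priors."""

    def __init__(self, priors=None):
        # Callers override only the priors they care about;
        # everything else falls back to the defaults.
        self.priors = {**default_priors(), **(priors or {})}

    def build(self):
        # In a real implementation this is where the PyMC3 model graph
        # would be assembled from self.priors; here we just return the spec.
        return self.priors

# Informative prior: "the slope should be around 3.2, plus or minus 1.5".
model = LinearRegression(priors={"slope": ("Normal", {"mu": 3.2, "sd": 1.5})})
spec = model.build()
```

The same pattern extends to the GMM case: the injected specification could carry either a fixed K or a distribution over K, and a single model-building function dispatches on what it receives.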

parsing-science commented 6 years ago

I completely agree. This initial release was meant to just get something out in the world so that people could make suggestions. If you want to open up a PR to inject priors, I'm happy to look at it!

rlouf commented 6 years ago

I'll look into it when it becomes relevant!