dfm / george

Fast and flexible Gaussian Process regression in Python
http://george.readthedocs.io
MIT License

Bayesian model comparison #116

Open stefanocovino opened 5 years ago

stefanocovino commented 5 years ago

Hi,

I am trying to derive some inference from regressions based on different kernel combinations, and I am doing so by computing Bayes factors (using a Parallel-Tempering Ensemble MCMC to compute the evidence). This works; however, as far as I understand the matter, it requires adding properly normalised priors (where they are needed). I just wonder whether the bounds one can introduce in the kernel definitions behave like flat priors, or whether it is better to define these priors explicitly and take care of their normalisation.

Thanks, Stefano
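A toy 1-D sketch of why the normalisation matters for evidence-based comparison (the likelihood and bounds here are purely illustrative, not anything from george): an unnormalised flat "prior" rescales the evidence by the bound width, so Bayes factors between models with different bound widths get silently biased.

```python
import numpy as np
from scipy import integrate

# Hypothetical bounds for a single hyperparameter.
a, b = -5.0, 5.0

def like(theta):
    # Toy Gaussian likelihood, chosen so its integral over the real line is 1.
    return np.exp(-0.5 * theta**2) / np.sqrt(2.0 * np.pi)

# Evidence Z = integral of L(theta) * p(theta) over [a, b].
# Unnormalised flat "prior": p(theta) = 1 on the bounds.
Z_unnormalized, _ = integrate.quad(lambda t: like(t) * 1.0, a, b)
# Properly normalised flat prior: p(theta) = 1 / (b - a) on the bounds.
Z_normalized, _ = integrate.quad(lambda t: like(t) / (b - a), a, b)

# The two evidences differ by exactly the factor (b - a), so the choice
# of normalisation changes any Bayes factor built from them.
```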

dfm commented 5 years ago

Good question! The priors returned by george are not properly normalized, so you'd be better off doing it yourself.
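A minimal sketch of what such a self-normalised flat prior might look like, suitable for adding to a log-likelihood in an MCMC sampler (the bounds and the two-parameter setup are illustrative assumptions, not values taken from george):

```python
import numpy as np

# Hypothetical bounds for two kernel hyperparameters
# (e.g. log-amplitude and log-length-scale) -- illustrative only.
bounds = np.array([[-5.0, 5.0],
                   [-5.0, 5.0]])

def log_prior(theta, bounds=bounds):
    """Properly normalised flat (uniform) log-prior over a box.

    Returns -inf outside the bounds, and -sum(log(width)) inside, so the
    prior integrates to 1 -- which is what an evidence (Bayes-factor)
    calculation needs, unlike bare hard bounds.
    """
    theta = np.atleast_1d(theta)
    lo, hi = bounds[:, 0], bounds[:, 1]
    if np.any(theta < lo) or np.any(theta > hi):
        return -np.inf
    return -np.sum(np.log(hi - lo))
```

The log-posterior passed to the sampler would then be `log_prior(theta) + gp.log_likelihood(y)` after setting the GP's parameter vector to `theta`.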

stefanocovino commented 5 years ago

Thanks! Good to know.

A kind suggestion: it could be useful to add an example to the docs showing a properly normalised prior written with george.

Bye, Stefano