TuringLang / Turing.jl

Bayesian inference with probabilistic programming.
https://turinglang.org
MIT License
2.03k stars · 218 forks

Missing RJMCMC (reversible-jump) sampler #2023

Open Moelf opened 1 year ago

Moelf commented 1 year ago

Just wanna open an issue to track this

Red-Portal commented 1 year ago

What do you mean by track? Was there any plan to include an RJMCMC sampler at any point?

Moelf commented 1 year ago

No, but others may want this too, and this issue can track how much interest there is.

bgroenks96 commented 1 year ago

I would be interested in it. I only heard of it recently from a paper, but it sounds like a nice workaround to the high computational cost of discrete model comparison (i.e. separately fitting the posterior for many different models with different parameter spaces). But practically, I would have no idea how to implement this in Turing.

Red-Portal commented 1 year ago

As long as you can infer which variable is the model indicator and the dimensionality of the parameters can change, one could, in principle, implement such a backend for Turing. I think I vaguely remember that DynamicPPL cannot handle dynamically changing dimensions, but maybe I'm wrong on this.

The thing that always concerns me about RJMCMC is that it effectively estimates Bayes factors inside the Metropolis-Hastings step. Since estimating Bayes factors is rarely a good idea, the cases where RJMCMC actually works are pretty limited. It is also notoriously hard to diagnose: a chain can look like it is working while actually producing garbage.
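To make the mechanics under discussion concrete, here is a minimal, self-contained toy sketch of reversible-jump MCMC — not Turing code, and not a proposed API, just an illustration in Python. It compares two nested Gaussian models (M1: fixed unit variance; M2: free variance) with equal model priors. The between-model jump proposes the new coordinate from its prior via an identity map, so the Jacobian is 1 and the prior term cancels against the proposal density, leaving a plain likelihood ratio in the acceptance step. All names and tuning constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data truly drawn from N(0, 1), so M1 (sigma fixed at 1) is adequate.
y = rng.normal(0.0, 1.0, size=50)

def loglik(mu, log_sigma):
    """Gaussian log-likelihood of y under N(mu, exp(log_sigma)^2)."""
    s = np.exp(log_sigma)
    return np.sum(-0.5 * np.log(2 * np.pi) - np.log(s) - 0.5 * ((y - mu) / s) ** 2)

def log_prior_mu(mu):
    # mu ~ N(0, 10^2), up to an additive constant
    return -0.5 * (mu / 10.0) ** 2

# State: (k, mu, ls); ls (= log sigma) is only part of the state under M2.
k, mu, ls = 1, 0.0, 0.0
counts = {1: 0, 2: 0}
n_iter = 5000

for _ in range(n_iter):
    # Within-model random-walk MH on mu (sigma is fixed to 1 under M1).
    mu_p = mu + 0.2 * rng.normal()
    cur_ls = ls if k == 2 else 0.0
    la = (loglik(mu_p, cur_ls) + log_prior_mu(mu_p)
          - loglik(mu, cur_ls) - log_prior_mu(mu))
    if np.log(rng.uniform()) < la:
        mu = mu_p

    # Within-model MH on log sigma, only meaningful under M2 (prior: N(0, 1)).
    if k == 2:
        ls_p = ls + 0.2 * rng.normal()
        la = (loglik(mu, ls_p) - 0.5 * ls_p ** 2) - (loglik(mu, ls) - 0.5 * ls ** 2)
        if np.log(rng.uniform()) < la:
            ls = ls_p

    # Between-model jump. Birth (M1 -> M2) proposes log sigma from its
    # N(0, 1) prior through an identity map: Jacobian = 1, and the prior
    # cancels the proposal density, so only the likelihood ratio remains
    # (model priors are equal and cancel too).
    if k == 1:
        u = rng.normal()
        la = loglik(mu, u) - loglik(mu, 0.0)
        if np.log(rng.uniform()) < la:
            k, ls = 2, u
    else:  # death move (M2 -> M1), the exact reverse
        la = loglik(mu, 0.0) - loglik(mu, ls)
        if np.log(rng.uniform()) < la:
            k = 1

    counts[k] += 1

# The fraction of iterations spent in each model estimates its
# posterior probability -- exactly the quantity whose reliability
# is being questioned above.
p1 = counts[1] / n_iter
print(f"posterior P(M1 | y) approx. {p1:.2f}")
```

Even in this two-model toy, the diagnosability problem is visible: the model-indicator trace is a binary sequence, so standard continuous-chain diagnostics say little about whether the between-model mixing is adequate.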