bilby-dev / bilby

A unified framework for stochastic sampling packages and gravitational-wave inference in Python. Note that we are currently transitioning from git.ligo.org/lscsoft/bilby, please bear with us!
https://bilby-dev.github.io/bilby/
MIT License

Improve performance of the normalization factor estimation for constraint priors #835

Open JasperMartins opened 1 week ago

JasperMartins commented 1 week ago

Currently, the estimation/integration of the normalization factor for constraint priors performs a fairly simple Monte Carlo integration that stops once a target number of accepted samples has been produced. This has two issues:

  1. If the constraint only removes a small part of the unconstrained volume, the target number of accepted samples is reached comparatively fast. Thus, the total number of proposed samples will be small, leading to a larger variance in the integral estimate compared to constraints that remove, say, half of the prior volume.
  2. On the flip side, if the constraint removes almost all of the prior volume, the integration routine will take a long time to converge to the target number of samples. This case is somewhat artificial since, for such priors, a different parametrization should probably be used to improve sampling efficiency anyway, but especially in very high dimensions, the prior volume removed by a constraint might be significantly larger than naively expected.
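To make the issue concrete, here is a minimal sketch (not bilby's actual implementation; the function and argument names are hypothetical) of the accept-count stopping rule described above. Note how a weak constraint terminates after very few proposals, so the estimate rests on a small total sample:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_norm_by_accepts(constraint, sample, n_accept=1000, batch_size=10000):
    # Hypothetical sketch: draw batches until n_accept samples satisfy the
    # constraint; the normalization estimate is accepted / proposed.
    accepted = 0
    proposed = 0
    while accepted < n_accept:
        batch = sample(batch_size)
        accepted += int(constraint(batch).sum())
        proposed += len(batch)
    return accepted / proposed

# A weak constraint (keeps ~99% of a Uniform(0, 1) prior) already hits
# n_accept within the first batch, so proposed stays at batch_size and the
# estimator's variance is governed by that small denominator.
frac = estimate_norm_by_accepts(lambda x: x < 0.99,
                                lambda n: rng.uniform(size=n))
```

The converse failure mode is symmetric: a constraint keeping only a tiny fraction of the volume forces the loop through very many batches before `accepted` reaches the target.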

For these reasons, I propose switching to an off-the-shelf stochastic integration routine that also reports the integration error, for instance the qmc_quad routine implemented in scipy. Alternatively, one could add a max_iter argument, or similar, to cap excessive runtimes in case 2. If such changes are up for consideration, I would go ahead with an implementation.

ColmTalbot commented 6 days ago

Thanks for the suggestion, I'd be happy to see a PR!