joshspeagle / dynesty

Dynamic Nested Sampling package for computing Bayesian posteriors and evidences
https://dynesty.readthedocs.io/
MIT License

New implementation of mlfriends #123

Closed JohannesBuchner closed 5 years ago

JohannesBuchner commented 5 years ago

See issue #121 .

For a simple Gaussian likelihood on the unit cube:

```python
import numpy as np

scales = np.array([0.1, 0.001, 0.3])

def loglike(x):
    # anisotropic Gaussian centered at 0.5 in each dimension
    return -0.5 * (((x - 0.5) / scales) ** 2).sum()
```

The speed-up from simple scaling is already a factor of 4.
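The scaling step can be sketched in plain NumPy (an illustrative reconstruction, not the PR's actual code): standardize the live points by their per-axis standard deviation before measuring nearest-neighbor distances, so the bounding radius is no longer inflated by the widest axis.

```python
import numpy as np

def max_nn_distance(points):
    """Largest nearest-neighbor distance among the points (brute force).
    Illustrative helper, not the PR's implementation."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
    np.fill_diagonal(d2, np.inf)
    return np.sqrt(d2.min(axis=1)).max()

rng = np.random.default_rng(1)
scales = np.array([0.1, 0.001, 0.3])
# mock live points drawn around the mode of the Gaussian above
pts = 0.5 + scales * rng.standard_normal((200, 3))

# RadFriends-style ball radius in the raw space vs. after per-axis
# standardization (the scaling idea of this PR, sketched)
std = pts.std(axis=0)
r_raw = max_nn_distance(pts)
r_scaled = max_nn_distance(pts / std)

# bounding-volume ratio: raw balls use r_raw along every axis, while the
# scaled version corresponds to ellipsoids with semi-axes r_scaled * std
ratio = r_raw**3 / (r_scaled**3 * std.prod())
```

The ratio is the fraction of proposal volume the unscaled balls waste along the narrow axes, which is roughly where the reported speed-up comes from.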

The clustering step, which detects disjoint groups of balls, helps with multimodal distributions. For example, the following bimodal likelihood:

```python
scales = np.array([0.1, 0.001, 0.3])
mu1 = np.array([0.3, 0.2, 0.3])
mu2 = np.array([0.3, 0.4, 0.3])

def loglike(x):
    # mixture of two anisotropic Gaussians, separated along the second axis
    l1 = -0.5 * (((x - mu1) / scales) ** 2).sum()
    l2 = -0.5 * (((x - mu2) / scales) ** 2).sum()
    return np.logaddexp(l1, l2)
```

sees another factor-of-4 gain in efficiency.
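The effect of the clustering step can be illustrated with a minimal friends-of-friends (single-linkage) grouping. The helper below is my own sketch, not the PR's implementation, and the linking radius of 1.5 is chosen by hand for this example.

```python
import numpy as np

def friends_clusters(points, radius):
    """Label points by connected 'friends' groups: two points are linked
    when their distance is below `radius` (single-linkage sketch,
    not the PR's actual clustering code)."""
    n = len(points)
    dist = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=2))
    labels = np.full(n, -1)
    group = 0
    for i in range(n):
        if labels[i] >= 0:
            continue
        stack = [i]
        labels[i] = group
        while stack:
            j = stack.pop()
            # link all still-unlabeled points within `radius` of point j
            friends = np.where((dist[j] < radius) & (labels < 0))[0]
            labels[friends] = group
            stack.extend(friends.tolist())
        group += 1
    return labels

rng = np.random.default_rng(0)
scales = np.array([0.1, 0.001, 0.3])
mu1 = np.array([0.3, 0.2, 0.3])
mu2 = np.array([0.3, 0.4, 0.3])
# mock live points drawn tightly around the two modes
pts = np.concatenate([mu1 + 0.05 * scales * rng.standard_normal((100, 3)),
                      mu2 + 0.05 * scales * rng.standard_normal((100, 3))])

# standardize per axis, then cluster
labels = friends_clusters(pts / pts.std(axis=0), radius=1.5)
```

Dividing out the per-axis scatter first is what makes the two modes, separated only along the narrow second axis, appear as clearly disjoint groups that never link to each other.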

The efficiency improvement grows without bound as scales[1] is made smaller.
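A back-of-envelope argument (my own illustration, not from the PR) for why the gain is unbounded: a single bounding ball must adopt the widest scale as its radius, so its volume grows like max(s)^d, while per-axis scaling yields an ellipsoid whose volume tracks the product of the scales.

```python
import numpy as np

scales = np.array([0.1, 0.001, 0.3])

# bounding ball: radius set by the widest axis, volume ∝ max(s)^d
ball_volume = scales.max() ** scales.size
# scaled ellipsoid: volume ∝ product of the per-axis scales
ellipsoid_volume = scales.prod()

# ignoring common constants, the efficiency gain scales like 1/scales[1]
improvement = ball_volume / ellipsoid_volume
```

Here the ratio is 0.3**3 / (0.1 * 0.001 * 0.3) = 900, and it keeps growing as scales[1] shrinks.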

Reference: https://arxiv.org/abs/1707.04476

joshspeagle commented 5 years ago

Looks good! I'll try to merge this in when I have some time to implement this more thoroughly. Hopefully that'll be sooner rather than later.

JohannesBuchner commented 5 years ago

Hi @joshspeagle, could you please merge this? I would like to build some things on top of dynesty+mlfriends, but it's cumbersome to make a release without this PR merged in some form.

joshspeagle commented 5 years ago

Sorry for the delay. I was traveling the last two weeks and am currently occupied for Passover with my in-laws. I’ll do this first thing next week.

joshspeagle commented 5 years ago

Merging this in now. I'll try to rework it to support covariances shortly (see #121).