minaskar / zeus

⚡️ zeus: Lightning Fast MCMC ⚡️
https://zeus-mcmc.readthedocs.io/
GNU General Public License v3.0

Walkers Initialization in Basic Use Examples: Shouldn't these use Uniform Distribution? #7

Open AdityaSavara opened 3 years ago

AdityaSavara commented 3 years ago

In the basic use example ( https://zeus-mcmc.readthedocs.io/en/latest/ ) currently there is this line:

start = np.random.randn(nwalkers, ndim)

(from https://zeus-mcmc.readthedocs.io/en/latest/index.html ). The same line also appears here: https://zeus-mcmc.readthedocs.io/en/latest/notebooks/datafit.html

I copied that line for how zeus is initialized in my code. But isn't it wrong? If you use a standard normal distribution you are biasing the starting points, and even burn-in probably will not completely remove the effect of that bias.

I think we probably want something like this:

start = 4*(np.random.rand(nwalkers, ndim)-0.5)

That way we get initialization from a bounded uniform distribution spanning -2 to +2 standard deviations.
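For example, here is a quick, purely illustrative check that this expression spans the intended range, compared with the standard-normal draws in the current docs example:

import numpy as np

uniform_draws = 4 * (np.random.rand(100000, 1) - 0.5)
print(uniform_draws.min(), uniform_draws.max())   # approximately -2.0 and +2.0

normal_draws = np.random.randn(100000, 1)
print(normal_draws.std())                         # ~1.0, but samples cluster near 0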

I have tried both ways in my code ( link ) and they give similar results.
I have switched to the uniform way in my code, like below:

walkerStartsFirstTerm = 4*(np.random.rand(nwalkers, numParameters)-0.5) 
walkerStartPoints = walkerStartsFirstTerm*std_prior + mean_prior
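For context, here is a minimal sketch of how this uniform-in-prior initialization could be fed into zeus. The log-probability, mean_prior, and std_prior below are hypothetical placeholders for illustration, not taken from my actual code, and I am assuming the standard zeus.EnsembleSampler / run_mcmc interface:

import numpy as np
import zeus

ndim = 2                   # number of parameters
nwalkers = 2 * ndim        # zeus needs at least 2*ndim walkers
nsteps = 1000

# Hypothetical prior summaries (placeholders only)
mean_prior = np.zeros(ndim)
std_prior = np.ones(ndim)

def log_prob(theta):
    # Toy standard-normal log-posterior, just to make the sketch runnable
    return -0.5 * np.sum(theta**2)

# Uniform initialization spanning roughly +/- 2 prior standard deviations
walkerStartsFirstTerm = 4 * (np.random.rand(nwalkers, ndim) - 0.5)
walkerStartPoints = walkerStartsFirstTerm * std_prior + mean_prior

sampler = zeus.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(walkerStartPoints, nsteps)
chain = sampler.get_chain(flat=True, discard=100)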

Edit: the FAQ suggests starting the walkers in a small ball near the MAP, so maybe this is okay: https://zeus-mcmc.readthedocs.io/en/latest/faq.html That may also mean that for some problems it is better to start with a posterior-maximizing routine (such as Metropolis-Hastings MCMC or Nelder-Mead) followed by Ensemble Slice Sampling. A sketch of that two-stage idea is below.
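A hedged sketch of what that two-stage approach might look like, using scipy's Nelder-Mead to locate the MAP and then starting the walkers in a small ball around it, as the FAQ suggests. The log-posterior and the ball radius (1e-3) here are toy placeholders:

import numpy as np
import zeus
from scipy.optimize import minimize

ndim = 2
nwalkers = 2 * ndim
nsteps = 1000

def log_post(theta):
    # Toy log-posterior; substitute your own
    return -0.5 * np.sum(theta**2)

# Stage 1: locate the MAP with a derivative-free optimizer (Nelder-Mead)
map_result = minimize(lambda t: -log_post(t), x0=np.zeros(ndim), method="Nelder-Mead")

# Stage 2: start walkers in a small Gaussian ball around the MAP,
# then run Ensemble Slice Sampling with zeus
start = map_result.x + 1e-3 * np.random.randn(nwalkers, ndim)
sampler = zeus.EnsembleSampler(nwalkers, ndim, log_post)
sampler.run_mcmc(start, nsteps)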

yuanzunli commented 3 years ago

@minaskar Dear Minas, I just tested zeus on my MCMC problem and it runs very well. But I have a question: zeus seems very similar to another MCMC package, emcee, including its function interface and basic usage. What is zeus's advantage over emcee?