dfm / emcee

The Python ensemble sampling toolkit for affine-invariant MCMC
https://emcee.readthedocs.io
MIT License

Error when using MPI #244

Open sshres opened 6 years ago

sshres commented 6 years ago

I'm only seeing this error at the end when using MPIPool with the emcee sampler; it works fine with a multiprocessing pool.

> Traceback (most recent call last):
>   File "mpiemcee.py", line 121, in <module>
>     pos, prob, state = sampler.run_mcmc(p0, 10)
>   File "/home/sash7676/anaconda2/lib/python2.7/site-packages/emcee/sampler.py", line 172, in run_mcmc
>     **kwargs):
>   File "/home/sash7676/anaconda2/lib/python2.7/site-packages/emcee/ensemble.py", line 198, in sample
>     lnprob, blobs = self._get_lnprob(p)
>   File "/home/sash7676/anaconda2/lib/python2.7/site-packages/emcee/ensemble.py", line 382, in _get_lnprob
>     results = list(M(self.lnprobfn, [p[i] for i in range(len(p))]))
> TypeError: 'NoneType' object is not iterable

Has anybody had this problem?

dfm commented 6 years ago

What version of emcee are you using? How did you install it? Please make a minimal working example that demonstrates this issue.

sshres commented 6 years ago

I installed it with 'pip install emcee'; the version is emcee-2.2.1. The code worked fine when I used the ProcessingPool from pathos.multiprocessing, but since I wanted to run my code on a cluster, I thought I'd use MPI, which is when I get the error posted above. Here's the full code, since the log posterior was a bit complicated to post here: https://github.com/sshres/CoupledModel/blob/master/mpiemcee_coupled.py

dfm commented 6 years ago

Can you please write a minimal working code snippet? It shouldn't depend on your data or your specific model; that way we can nail down where the issue is coming from. You should also say exactly what command line you're executing to run your script with MPI.
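
As an illustration, here is a minimal, self-contained sketch of the kind of snippet being asked for, assuming emcee 2.x with its bundled MPIPool (emcee.utils.MPIPool) and a toy Gaussian log-posterior; lnprob, ndim, nwalkers, and p0 are illustrative stand-ins, not taken from the original script.

```python
import sys

import numpy as np
import emcee
from emcee.utils import MPIPool  # bundled MPI pool in emcee 2.x


def lnprob(x):
    # Toy log-posterior: an isotropic unit Gaussian.
    return -0.5 * np.sum(x ** 2)


pool = MPIPool()
if not pool.is_master():
    # Worker ranks wait here for tasks from the master, then exit.
    pool.wait()
    sys.exit(0)

ndim, nwalkers = 5, 100
p0 = np.random.randn(nwalkers, ndim)

sampler = emcee.EnsembleSampler(nwalkers, ndim, lnprob, pool=pool)
pos, prob, state = sampler.run_mcmc(p0, 10)

pool.close()
```

Run with something like 'mpiexec -n 4 python script.py' (the script name and process count are placeholders).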

Also: what does emcee.__version__ say? And how did you install mpi4py? What version of mpi4py is it?
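
For reference, a quick way to report both versions (a minimal sketch; both packages expose __version__):

```python
# Print the installed emcee and mpi4py versions.
import emcee
import mpi4py

print("emcee:", emcee.__version__)
print("mpi4py:", mpi4py.__version__)
```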

dfm commented 6 years ago

Also: you might want to double check the emcee MPI docs: http://emcee.readthedocs.io/en/stable/user/advanced.html#using-mpi-to-distribute-the-computations
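
For context, the docs matter here because, if I recall the emcee 2.x implementation correctly, MPIPool.map only returns results on the master rank; on worker ranks it returns None, which is exactly where a "'NoneType' object is not iterable" would come from if every rank runs the sampler. A hedged sketch of the relevant block (see the linked docs for the authoritative version):

```python
# Worker/master split described in the linked docs (assuming emcee 2.x's
# emcee.utils.MPIPool). Without this block, every MPI rank runs the sampler,
# and on worker ranks MPIPool.map returns None instead of log-probabilities.
import sys
from emcee.utils import MPIPool

pool = MPIPool()
if not pool.is_master():
    pool.wait()   # service map() requests from the master
    sys.exit(0)   # then exit so the worker never reaches the sampler

# ... only the master constructs the sampler and calls run_mcmc(...) ...
pool.close()
```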
