CamDavidsonPilon / Probabilistic-Programming-and-Bayesian-Methods-for-Hackers

aka "Bayesian Methods for Hackers": An introduction to Bayesian methods + probabilistic programming with a computation/understanding-first, mathematics-second point of view. All in pure Python ;)
http://camdavidsonpilon.github.io/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers/
MIT License
26.8k stars 7.88k forks

Issues running sample code in Chapter 1 Introduction: #426

Open IvoCrnkovic opened 5 years ago

IvoCrnkovic commented 5 years ago

When I run Ch1_Introduction_PyMC3.ipynb sequentially, everything works great until I get to the following lines:

### Mysterious code to be explained in Chapter 3.
with model:
    step = pm.Metropolis()
    trace = pm.sample(10000, tune=5000, step=step)

Which breaks with:

Multiprocess sampling (4 chains in 4 jobs)
CompoundStep
>Metropolis: [tau]
>Metropolis: [lambda_2]
>Metropolis: [lambda_1]

BrokenProcessPool Traceback (most recent call last)
<ipython-input-18-09691883d70f> in <module>()
      2 with model:
      3     #step = pm.Metropolis()
----> 4     trace = pm.sample(10000, tune=5000,step=step)

/usr/local/lib/python2.7/dist-packages/pymc3/sampling.pyc in sample(draws, step, init, n_init, start, trace, chain_idx, chains, cores, tune, nuts_kwargs, step_kwargs, progressbar, model, random_seed, live_plot, discard_tuned_samples, live_plot_kwargs, compute_convergence_checks, use_mmap, **kwargs)
    447             _print_step_hierarchy(step)
    448             try:
--> 449                 trace = _mp_sample(**sample_args)
    450             except pickle.PickleError:
    451                 _log.warning("Could not pickle model, sampling singlethreaded.")

/usr/local/lib/python2.7/dist-packages/pymc3/sampling.pyc in _mp_sample(draws, tune, step, chains, cores, chain, random_seed, start, progressbar, trace, model, use_mmap, **kwargs)
   1029             traces = Parallel(n_jobs=cores)(jobs)
   1030         else:
-> 1031             traces = Parallel(n_jobs=cores, mmap_mode=None)(jobs)
   1032         return MultiTrace(traces)
   1033 

/usr/local/lib/python2.7/dist-packages/joblib/parallel.pyc in __call__(self, iterable)
    928 
    929             with self._backend.retrieval_context():
--> 930                 self.retrieve()
    931             # Make sure that we get a last message telling us we are done
    932             elapsed_time = time.time() - self._start_time

/usr/local/lib/python2.7/dist-packages/joblib/parallel.pyc in retrieve(self)
    831             try:
    832                 if getattr(self._backend, 'supports_timeout', False):
--> 833                     self._output.extend(job.get(timeout=self.timeout))
    834                 else:
    835                     self._output.extend(job.get())

/usr/local/lib/python2.7/dist-packages/joblib/_parallel_backends.pyc in wrap_future_result(future, timeout)
    519         AsyncResults.get from multiprocessing."""
    520         try:
--> 521             return future.result(timeout=timeout)
    522         except LokyTimeoutError:
    523             raise TimeoutError()

/usr/local/lib/python2.7/dist-packages/joblib/externals/loky/_base.pyc in result(self, timeout)
    431                     raise CancelledError()
    432                 elif self._state == FINISHED:
--> 433                     return self.__get_result()
    434                 else:
    435                     raise TimeoutError()

/usr/local/lib/python2.7/dist-packages/joblib/externals/loky/_base.pyc in __get_result(self)
    379         def __get_result(self):
    380             if self._exception:
--> 381                 raise self._exception
    382             else:
    383                 return self._result

BrokenProcessPool: A task has failed to un-serialize. Please ensure that the arguments of the function are all picklable.

This was caused directly by 
'''
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/joblib/externals/loky/process_executor.py", line 391, in _process_worker
    call_item = call_queue.get(block=True, timeout=timeout)
  File "/usr/lib/python2.7/multiprocessing/queues.py", line 135, in get
    res = self._recv()
  File "/usr/local/lib/python2.7/dist-packages/pymc3/step_methods/arraystep.py", line 39, in __new__
    model = modelcontext(kwargs.get('model'))
  File "/usr/local/lib/python2.7/dist-packages/pymc3/model.py", line 190, in modelcontext
    return Model.get_context()
  File "/usr/local/lib/python2.7/dist-packages/pymc3/model.py", line 182, in get_context
    raise TypeError("No context on context stack")
TypeError: No context on context stack
'''
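The last lines of the traceback point at serialization: the worker processes receive their arguments by pickling, and the step method cannot be reconstructed on the other side. A minimal, PyMC3-free sketch of the same failure mode (the names here are illustrative, not from the notebook):

```python
import pickle

# Objects that only exist as runtime state (a lambda here, a step method
# bound to a model context in PyMC3) cannot cross a process boundary,
# because multiprocessing transports arguments by pickling them.
unpicklable = lambda x: x + 1

try:
    pickle.dumps(unpicklable)
    failed = False
except pickle.PicklingError:
    failed = True

print(failed)  # pickling a lambda raises PicklingError
```

Anything that ends up in the job arguments must survive a `pickle.dumps` / `pickle.loads` round trip, which is exactly what the `BrokenProcessPool` message is asking for.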
IvoCrnkovic commented 5 years ago

Python 2.7.14:

import sys
sys.version
'2.7.14 (default, Sep 23 2017, 22:06:14) \n[GCC 7.2.0]'
IvoCrnkovic commented 5 years ago

I get the same error in Chapter 2 with the lines:

#include the observations, which are Bernoulli
with model:
    obs = pm.Bernoulli("obs", p, observed=occurrences)
    # To be explained in chapter 3
    step = pm.Metropolis()
    trace = pm.sample(18000, step=step)
    burned_trace = trace[1000:]

The error occurs at this specific statement:

trace = pm.sample(18000, step=step)
herrmalte commented 4 years ago


I get the same error.

Did anyone manage to solve this? @IvoCrnkovic

javabean68 commented 4 years ago

Hi, perhaps you should use Python 3.6? Bye Fabio
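The suggestion above can be checked up front before running the notebooks; a minimal sketch (the 3.6 floor is just the version suggested in this thread):

```python
import sys

# The multiprocess sampling in these notebooks is reported to break on
# Python 2.7; verify the interpreter version before running them.
ok = sys.version_info >= (3, 6)
print(ok)
```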

herrmalte commented 4 years ago

Thanks, will try that!
