uqfoundation / pathos

parallel graph management and execution in heterogeneous computing
http://pathos.rtfd.io

Multiple pickle failures in the pathos examples and tests #125

Closed. Zebrafish007 closed this 6 years ago

Zebrafish007 commented 6 years ago

Dear Mr. Mike McKerns,

I've run into a number of pickling errors across the scripts in the pathos examples and tests folders. They might be related to each other; I'm not sure. See the attachment for the traceback from each file I had trouble with. I hope you can help me out here... 7 cores are waiting to be fed with pickles.

The most frequently recurring error message in the attached bug file is cPickle.PicklingError; from the tracebacks I can't tell whether the culprit is python, conda, or the pickler itself (pickle/cPickle vs. dill).

--- some sys and install info ---

Via pip: pathos 0.2.1 and multiprocess 0.70.5; via conda: dill 0.2.7.1. Anaconda2 with Python 2.7.13 on Windows 10; editor: Komodo Edit 10.2.3 and 11.0.

These pickle errors also show up with the StackOverflow example when the defs are placed in a class:

https://stackoverflow.com/questions/20887555/dead-simple-example-of-using-multiprocessing-queue-pool-and-locking

...and your answer: https://stackoverflow.com/a/21345273

pathos errors (dir_examples_and_dir_test).txt
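
For reference, a minimal sketch of the pattern that seems to trip things up here (the Example class below is just a stand-in, not taken from the pathos examples): on Python 2.7, stdlib pickle refuses to serialize bound methods, while dill handles them.

import pickle
import dill

class Example(object):            # hypothetical stand-in for the classes in my scripts
    def square(self, x):
        return x * x

obj = Example()

try:
    pickle.dumps(obj.square)      # stdlib pickle on Python 2.7 can't serialize a bound method
except Exception as e:
    print("pickle failed: %s" % e)

print("dill produced %d bytes" % len(dill.dumps(obj.square)))   # dill serializes the bound method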


Script snippet:

class Multiprocess(object):

def __init__(self):
    pass

def qmp_worker(self,(inputs, the_time)):
    print " Processs %s\tWaiting %s seconds" % (inputs, the_time)
    time.sleep(int(the_time))
    print " Process %s\tDONE" % inputs

def qmp_handler(self):                           # Non tandem pair processing
    pool = pp.ProcessPool(2)
    pool.map(self.qmp_worker, data)

def mp_worker((inputs, the_time)):
    print " Processs %s\tWaiting %s seconds" % (inputs, the_time)
    time.sleep(int(the_time))
    print " Process %s\tDONE" % inputs

def mp_handler():                           # Non tandem pair processing
    p = multiprocessing.Pool(2)
    p.map(mp_worker, data)

def mp_handler_tandem():
    subdata = zip(data[0::2], data[1::2])
    print subdata
    for task1, task2 in subdata:
        p = multiprocessing.Pool(2)
        p.map(mp_worker, (task1, task2))

#data = (['a', '1'], ['b', '2'], ['c', '3'], ['d', '4'])
data = (['a', '2'], ['b', '3'], ['c', '1'], ['d', '4'], ['e', '1'], ['f', '2'], ['g', '3'], ['h', '4'])

if __name__ == '__main__':
    mp_handler()
    mp_handler_tandem()

    Multiprocess().qmp_handler()
Zebrafish007 commented 6 years ago

I've stumbled upon something that might be an answer to the pickle issue at hand... not sure though.

For the case of PyQt5 not reporting errors back to standard out as a normal traceback, there is an implementation to catch them, given by Vanloc on StackOverflow. The link:

https://stackoverflow.com/a/43039363

Vanloc's approach:

To catch the exceptions, you need to overwrite the sys exception handler:

# Back up the reference to the exceptionhook
sys._excepthook = sys.excepthook

def my_exception_hook(exctype, value, traceback):
    # Print the error and traceback
    print(exctype, value, traceback)
    # Call the normal Exception hook after
    sys._excepthook(exctype, value, traceback)
    sys.exit(1)

# Set the exception hook to our wrapping function
sys.excepthook = my_exception_hook

Then, in your execution code, wrap it in a try/except block.

try:
    sys.exit(app.exec_())
except:
    print("Exiting")

So the question is... should something similar be done with pickle and dill, like the code above (condensed below), which Vanloc gave in response to the issue T4ng10r had with his Nvidia driver and PyQt5 not reporting errors back in a pythonic way?

    # Back up the reference to the exceptionhook
    sys._excepthook = sys.excepthook

    # Set the exception hook to our wrapping function
    sys.excepthook = my_exception_hook
Zebrafish007 commented 6 years ago

On Windows 10, your mp_example.py script (code below) produces the error that follows:

import dill
import pickle
import multiprocessing
from pathos.pools import ProcessPool, ThreadPool
import pathos.helpers
from multiprocessing import freeze_support
import logging
log = logging.getLogger(__name__)

class PMPExample(object):
    def __init__(self):
        self.cache = {}

    def compute(self, x):
        self.cache[x] = x ** 3
        return self.cache[x]

    def threadcompute(self, xs):
        pool = ThreadPool(4)
        results = pool.map(self.compute, xs)
        return results

    def processcompute(self, xs):
        pool = ProcessPool(4)
        results = pool.map(self.compute, xs)
        return results

def parcompute_example():
    dc = PMPExample()
    dc2 = PMPExample()
    dc3 = PMPExample()
    dc4 = PMPExample()

    n_datapoints = 100
    inp_data = range(n_datapoints)
    r1 = dc.threadcompute(inp_data)
    assert(len(dc.cache) == n_datapoints)

    r2 = dc2.processcompute(inp_data)
    assert(len(dc2.cache) == 0)
    assert(r1 == r2)

    r3 = ProcessPool(4).map(dc3.compute, inp_data)
    r4 = ThreadPool(4).map(dc4.compute, inp_data)
    assert(r4 == r3 == r2)
    assert(len(dc3.cache) == 0)
    assert(len(dc4.cache) == n_datapoints)

    log.info("Size of threadpooled class caches: {0}, {1}".format(len(dc.cache), len(dc4.cache)))
    log.info("Size of processpooled class caches: {0}, {1}".format(len(dc2.cache), len(dc3.cache)))

if __name__ == '__main__':
    pathos.helpers.freeze_support()
    logging.basicConfig()
    log.setLevel(logging.INFO)

    parcompute_example()

The error:

Traceback (most recent call last):
  File "D:\Scripts\Thirdparty\Pathos\mp_class_example.py", line 64, in <module>
    parcompute_example()
  File "D:\Scripts\Thirdparty\Pathos\mp_class_example.py", line 46, in parcompute_example
    r2 = dc2.processcompute(inp_data)
  File "D:\Scripts\Thirdparty\Pathos\mp_class_example.py", line 32, in processcompute
    results = pool.map(self.compute, xs)
  File "c:\python\anac2\lib\site-packages\pathos\multiprocessing.py", line 137, in map
    return _pool.map(star(f), zip(*args)) # chunksize
  File "c:\python\anac2\lib\site-packages\multiprocess\pool.py", line 251, in map
    return self.map_async(func, iterable, chunksize).get()
  File "c:\python\anac2\lib\site-packages\multiprocess\pool.py", line 567, in get
    raise self._value
cPickle.PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

As you can see, #118 did not solve the issue at hand. What else can I try?

mmckerns commented 6 years ago

I have tried your script on Windows 10, using the most recent versions (from GitHub) of dill and pathos, with python 2.7.10. That's not exactly the same as your setup, but I don't see any errors with my configuration. I'll need to do a little more work to match your versions... but maybe you can try with the most recent versions of the code and see if the error still persists?

Another thing we can try is to simplify the example (or examples) and run it with dill.detect.trace(True), then compare the traces to see where mine succeeds and yours fails.
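
For example, something like the following minimal sketch (it reuses the PMPExample class from your script; it is not one of the shipped examples) should show whether the bound method itself pickles cleanly under dill on your setup:

import dill
dill.detect.trace(True)                 # print each object dill pickles, step by step

class PMPExample(object):               # trimmed copy of the class from your script
    def __init__(self):
        self.cache = {}
    def compute(self, x):
        self.cache[x] = x ** 3
        return self.cache[x]

obj = PMPExample()
print(dill.detect.errors(obj.compute))  # the exception raised during pickling, if any
dill.copy(obj.compute)                  # round-trip the bound method through dill's pickler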

Zebrafish007 commented 6 years ago

I'll try dill.detect tonight and report back with the results.

Waponiwoo commented 6 years ago

I'm getting the same error in a conda environment using Spyder, though on Windows Server 2012 R2 Standard. Here is the code, a simplified version of one of your examples:

import dill
dill.detect.trace(True)

def host(x):
    return x*x

if __name__ == '__main__':
    from pathos.helpers import freeze_support
    freeze_support()

    from pathos.pools import ProcessPool as Pool
    pool = Pool()

    pool.ncpus = 2
    res3 = pool.map(host, range(5))
    print(pool)
    print('\n'.join(res3))
    print('')

output:

D2: <dict object at 0x0000000007F3A8C8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F3A598>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F3A6A8>
# D2
# D2
D2: <dict object at 0x0000000007F49158>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F3AAE8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F49048>
# D2
# D2
D2: <dict object at 0x0000000007F4BAE8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F42BF8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F4B9D8>
# D2
# D2
D2: <dict object at 0x0000000007F50598>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F42D08>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F50488>
# D2
# D2
D2: <dict object at 0x0000000007F51158>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F3A7B8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F51048>
# D2
# D2
D2: <dict object at 0x0000000007F537B8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F3FAE8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F536A8>
# D2
# D2
D2: <dict object at 0x0000000007F539D8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F4D7B8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F538C8>
# D2
# D2
D2: <dict object at 0x0000000007F53BF8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F3FD08>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F53AE8>
# D2
# D2
D2: <dict object at 0x0000000007F53E18>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F51D08>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F53D08>
# D2
# D2
D2: <dict object at 0x0000000007F56158>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F51AE8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F56048>
# D2
# D2
D2: <dict object at 0x0000000007F56378>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F50BF8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F56268>
# D2
# D2
D2: <dict object at 0x0000000007F56598>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F4D048>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F56488>
# D2
# D2
D2: <dict object at 0x0000000007F567B8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F4B158>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F566A8>
# D2
# D2
D2: <dict object at 0x0000000007F57158>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F4B268>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F57048>
# D2
# D2
D2: <dict object at 0x0000000007F57378>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F496A8>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F57268>
# D2
# D2
D2: <dict object at 0x0000000007F57598>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F51488>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F57488>
# D2
# D2
D2: <dict object at 0x0000000007F466A8>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F50158>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F449D8>
# D2
# D2
D2: <dict object at 0x0000000007F46048>
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
# D2
T4: <class 'multiprocess.process.Process'>
# T4
D2: <dict object at 0x0000000007F4B488>
F2: <function worker at 0x0000000007E2C518>
# F2
T4: <class 'multiprocess.queues.SimpleQueue'>
# T4
T4: <type '_multiprocessing.PipeConnection'>
# T4
T4: <class 'multiprocess.synchronize.Lock'>
# T4
T4: <class 'multiprocess.process.AuthenticationString'>
# T4
D2: <dict object at 0x0000000007F467B8>
# D2
# D2
Traceback (most recent call last):

  File "<ipython-input-1-5c15322e47d7>", line 16, in <module>
    res3 = pool.map(host, range(5))

  File "D:\Anaconda2\envs\Everything_1_22_18\lib\site-packages\pathos\multiprocessing.py", line 137, in map
    return _pool.map(star(f), zip(*args)) # chunksize

  File "D:\Anaconda2\envs\Everything_1_22_18\lib\site-packages\multiprocess\pool.py", line 251, in map
    return self.map_async(func, iterable, chunksize).get()

  File "D:\Anaconda2\envs\Everything_1_22_18\lib\site-packages\multiprocess\pool.py", line 567, in get
    raise self._value

PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

Env dump:

# packages in environment at D:\Anaconda2\envs\Everything_1_22_18:
#
alabaster                 0.7.10                   py27_0  
appdirs                   1.4.3                     <pip>
arrow                     0.10.0                    <pip>
asn1crypto                0.23.0                    <pip>
astroid                   1.5.3                    py27_0  
babel                     2.5.0                    py27_0  
backports                 1.0                      py27_0  
backports_abc             0.5                      py27_0  
bleach                    1.5.0                    py27_0  
blinker                   1.4                       <pip>
certifi                   2016.2.28                py27_0  
cffi                      1.11.2                    <pip>
chardet                   3.0.4                    py27_0  
colorama                  0.3.9                    py27_0  
configparser              3.5.0                    py27_0  
cryptography              2.1.3                     <pip>
decorator                 4.1.2                    py27_0  
dill                      0.2.7.1                   <pip>
docutils                  0.14                     py27_0  
entrypoints               0.2.3                    py27_0  
enum34                    1.1.6                    py27_0  
funcsigs                  1.0.2                     <pip>
functools32               3.2.3.2                  py27_0  
functools_lru_cache       1.4                      py27_0  
future                    0.16.0                    <pip>
get_terminal_size         1.0.0                    py27_0  
html5lib                  0.999                    py27_0  
icu                       57.1                      vc9_0  [vc9]
idna                      2.6                       <pip>
imagesize                 0.7.1                    py27_0  
ipaddress                 1.0.18                    <pip>
ipykernel                 4.6.1                    py27_0  
ipython                   5.3.0                    py27_0  
ipython_genutils          0.2.0                    py27_0  
isort                     4.2.15                   py27_0  
jdcal                     1.3                       <pip>
jedi                      0.10.2                   py27_2  
jinja2                    2.9.6                    py27_0  
jpeg                      9b                        vc9_0  [vc9]
jsonschema                2.6.0                    py27_0  
jupyter_client            5.1.0                    py27_0  
jupyter_core              4.3.0                    py27_0  
lazy-object-proxy         1.3.1                    py27_0  
libpng                    1.6.30                    vc9_1  [vc9]
markupsafe                1.0                      py27_0  
mistune                   0.7.4                    py27_0  
mkl                       2017.0.3                      0  
mock                      2.0.0                     <pip>
multiprocess              0.70.5                    <pip>
nbconvert                 5.2.1                    py27_0  
nbformat                  4.4.0                    py27_0  
nose                      1.3.7                     <pip>
ntlm-auth                 1.0.6                     <pip>
numpy                     1.13.1                   py27_0  
numpydoc                  0.7.0                    py27_0  
openssl                   1.0.2l                    vc9_0  [vc9]
osisoftpy                 2.3.5                     <pip>
pandas                    0.20.3                   py27_0  
pandocfilters             1.4.2                    py27_0  
path.py                   10.3.1                   py27_0  
pathlib2                  2.3.0                    py27_0  
pathos                    0.2.1                     <pip>
patsy                     0.4.1                    py27_0  
pbr                       3.1.1                     <pip>
pickleshare               0.7.4                    py27_0  
pip                       9.0.1                    py27_1  
ply                       3.10                      <pip>
pox                       0.2.3                     <pip>
ppft                      1.6.4.7.1                 <pip>
prompt_toolkit            1.0.15                   py27_0  
psutil                    5.2.2                    py27_0  
pycodestyle               2.3.1                    py27_0  
pycparser                 2.18                      <pip>
pyflakes                  1.6.0                    py27_0  
pygments                  2.2.0                    py27_0  
pylint                    1.7.2                    py27_0  
pymssql                   2.1.3                     <pip>
pyodbc                    4.0.21                    <pip>
Pyomo                     5.3                       <pip>
pyqt                      5.6.0                    py27_2  
pyreadline                2.1                       <pip>
python                    2.7.13                        1  
python-dateutil           2.6.1                    py27_0  
pytz                      2017.2                   py27_0  
PyUtilib                  5.6                       <pip>
pyzmq                     16.0.2                   py27_0  
qt                        5.6.2                     vc9_6  [vc9]
qtawesome                 0.4.4                    py27_0  
qtconsole                 4.3.1                    py27_0  
qtpy                      1.3.1                    py27_0  
requests                  2.14.2                   py27_0  
requests-kerberos         0.11.0                    <pip>
requests-ntlm             1.1.0                     <pip>
rope                      0.9.4                    py27_1  
scandir                   1.5                      py27_0  
scipy                     0.19.1              np113py27_0  
setuptools                36.4.0                   py27_0  
simplegeneric             0.8.1                    py27_1  
singledispatch            3.4.0.3                  py27_0  
sip                       4.18                     py27_0  
six                       1.10.0                   py27_0  
snowballstemmer           1.2.1                    py27_0  
sphinx                    1.6.3                    py27_0  
sphinxcontrib             1.0                      py27_0  
sphinxcontrib-websupport  1.0.1                    py27_0  
spyder                    3.2.3                    py27_0  
ssl_match_hostname        3.5.0.1                  py27_0  
statsmodels               0.8.0               np113py27_0  
testpath                  0.3.1                    py27_0  
tornado                   4.5.2                    py27_0  
traitlets                 4.3.2                    py27_0  
typing                    3.6.2                    py27_0  
vs2008_runtime            9.00.30729.5054               0  
wcwidth                   0.1.7                    py27_0  
wheel                     0.29.0                   py27_0  
win_unicode_console       0.5                      py27_0  
wincertstore              0.2                      py27_0  
winkerberos               0.7.0                     <pip>
wrapt                     1.10.11                  py27_0  
zlib                      1.2.11                    vc9_0  [vc9]
mmckerns commented 6 years ago

@Waponiwoo: Are you using python 2.7... and if so, can you confirm that you have a C compiler? If you don't have a C compiler, then multiprocess may not have built correctly, and you might see the error you posted. I ran your example code, and it works for me.
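
As a quick sanity check (just a sketch, not one of the shipped tests): if the plain multiprocess pool below already fails on your machine, the problem is most likely in the multiprocess build/install rather than in pathos itself.

import dill, multiprocess, pathos
print("dill %s, multiprocess %s, pathos %s" % (
    dill.__version__, multiprocess.__version__, pathos.__version__))

def square(x):
    return x * x

if __name__ == '__main__':
    from multiprocess import Pool, freeze_support
    freeze_support()                    # needed when frozen on Windows
    p = Pool(2)
    print(p.map(square, range(5)))      # expect [0, 1, 4, 9, 16]
    p.close()
    p.join()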

@Zebrafish007: Are you still having issues, or can this be closed? I don't get your error either, and it may be due to the same reason... that you are missing a C compiler. I also tried your initial snippet, but the code wasn't valid as posted... so I edited it, and the edited version below runs without a problem:

from pathos import multiprocessing, pools as pp
import time

class Multiprocess(object):

    def __init__(self):
        pass

    def qmp_worker(self,(inputs, the_time)):
        print " Processs %s\tWaiting %s seconds" % (inputs, the_time)
        time.sleep(int(the_time))
        print " Process %s\tDONE" % inputs

    def qmp_handler(self):                      # Non tandem pair processing
        pool = pp.ProcessPool(2)
        pool.map(self.qmp_worker, data)

def mp_worker((inputs, the_time)):
    print " Processs %s\tWaiting %s seconds" % (inputs, the_time)
    time.sleep(int(the_time))
    print " Process %s\tDONE" % inputs

def mp_handler():                           # Non tandem pair processing
    p = multiprocessing.Pool(2)
    p.map(mp_worker, data)

def mp_handler_tandem():
    subdata = zip(data[0::2], data[1::2])
    print subdata
    for task1, task2 in subdata:
        p = multiprocessing.Pool(2)
        p.map(mp_worker, (task1, task2))

#data = (['a', '1'], ['b', '2'], ['c', '3'], ['d', '4'])
data = (['a', '2'], ['b', '3'], ['c', '1'], ['d', '4'], ['e', '1'], ['f', '2'], ['g', '3'], ['h', '4'])

if __name__ == '__main__':
    mp_handler()
    mp_handler_tandem()

    Multiprocess().qmp_handler()
mmckerns commented 6 years ago

Since I can't reproduce the errors, and both reporters' tracebacks have the signature of a missing C compiler, I'm going to close this issue. If either of you does have a C compiler and is still experiencing issues, then please reopen the ticket.

gziras-in commented 2 years ago

It seems #227 is the same issue.