uqfoundation / pathos

parallel graph management and execution in heterogeneous computing
http://pathos.rtfd.io

TypeError: can't pickle SwigPyObject objects #148

Open Mykheievskyi opened 6 years ago

Mykheievskyi commented 6 years ago

Can pathos support serializing a class that holds TensorFlow objects?

Code:

from pathos.multiprocessing import ProcessingPool
import time
import tensorflow as tf
import numpy as np

class A(object):
    def __init__(self):
        self.session = tf.Session()  # TF 1.x session; wraps a SWIG-level handle (a SwigPyObject)
        self.batch_size = 10
        self.pool = ProcessingPool(processes=self.batch_size)

    def _run_thread(self, blob):
        print('SHOULD BE SESSION.RUN')
        time.sleep(0.500)
        return blob[:, :, :, 0] * self.batch_size

    def run(self, blob):
        blob = np.expand_dims(blob, axis=0)
        blob_list = [blob[:, i, :, :] for i in range(self.batch_size)]
        predictions_list = self.pool.map(self._run_thread, blob_list)  # pickling the bound method drags in self, including the session

        predictions = np.stack(predictions_list, axis=1)[0]
        return predictions 

if __name__ == "__main__":
    blob = np.ones((3, 180, 320, 6), dtype=np.float32)

    a = A()
    start = time.time()
    output = a.run(blob)
    print("time: {:.3f}".format(time.time() - start))

Log:

2018-08-23 12:03:32.418037: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Traceback (most recent call last):
  File "/home/d_mykheievskyi/projects/raspberry/bs_new/test_multiprocesing.py", line 74, in <module>
    output = a.run(blob)
  File "/home/d_mykheievskyi/projects/raspberry/bs_new/test_multiprocesing.py", line 39, in run
    predictions_list = self.pool.map(self._run_thread, blob_list)
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/pathos/multiprocessing.py", line 137, in map
    return _pool.map(star(f), zip(*args)) # chunksize
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/multiprocess/pool.py", line 266, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/multiprocess/pool.py", line 644, in get
    raise self._value
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/multiprocess/pool.py", line 424, in _handle_tasks
    put(task)
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/multiprocess/connection.py", line 209, in send
    self._send_bytes(ForkingPickler.dumps(obj))
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/multiprocess/reduction.py", line 53, in dumps
    cls(buf, protocol).dump(obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 408, in dump
    self.save(obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 744, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 729, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 729, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 1377, in save_function
    obj.__dict__), obj=obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 603, in save_reduce
    save(args)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 744, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 729, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 1120, in save_cell
    pickler.save_reduce(_create_cell, (f,), obj=obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 603, in save_reduce
    save(args)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 729, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 1069, in save_instancemethod0
    pickler.save_reduce(MethodType, (obj.__func__, obj.__self__), obj=obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 603, in save_reduce
    save(args)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 729, in save_tuple
    save(element)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 627, in save_reduce
    save(state)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 893, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 814, in save_dict
    self._batch_setitems(obj.items())
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 840, in _batch_setitems
    save(v)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 627, in save_reduce
    save(state)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 893, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 814, in save_dict
    self._batch_setitems(obj.items())
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 840, in _batch_setitems
    save(v)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 627, in save_reduce
    save(state)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 893, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 814, in save_dict
    self._batch_setitems(obj.items())
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 840, in _batch_setitems
    save(v)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 520, in save
    self.save_reduce(obj=obj, *rv)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 627, in save_reduce
    save(state)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 475, in save
    f(self, obj) # Call unbound method with explicit self
  File "/home/d_mykheievskyi/projects/raspberry/tensorflow1.10/lib/python3.5/site-packages/dill/_dill.py", line 893, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 814, in save_dict
    self._batch_setitems(obj.items())
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 845, in _batch_setitems
    save(v)
  File "/opt/anaconda3/lib/python3.5/pickle.py", line 495, in save
    rv = reduce(self.proto)
TypeError: can't pickle SwigPyObject objects

lw3259111 commented 5 years ago

@Mykheievskyi I've hit the same problem. Have you solved it? @mmckerns

7wdeepin commented 5 years ago

@Mykheievskyi @lw3259111 I've hit the same problem as well. Have you solved it?

NoushNabi commented 5 years ago

I have faced the same issue. Any suggestions? Thanks.

kazemSafari commented 5 years ago

Me too. Is there any workaround? Thanks.

halessi commented 5 years ago

Has anybody had any luck? I get the same traceback with Python 3.6.0 and pathos 0.2.2.1.

ketan0 commented 5 years ago

I am facing the same issue.

dushiel commented 5 years ago

I am facing the same issue as well.

mmckerns commented 5 years ago

Yeah, the issue is that dill doesn't know how to handle a SwigPyObject. I looked for a dill issue to point this to, but didn't see one after a cursory look -- one may need to be created. Note that if you can modify the class that holds the object to add a __reduce__ method, or build a subclass that derives from object, or something along those lines, the class should serialize.
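
As a concrete illustration of that suggestion, here is a minimal, untested sketch applied to the class A from the original report. It uses the __getstate__/__setstate__ pickle hooks, which serve the same purpose as __reduce__ for this case: the unpicklable members are dropped from the pickled state and rebuilt on the worker side.

import tensorflow as tf

class A(object):
    def __init__(self):
        self.session = tf.Session()  # holds the SwigPyObject that dill can't pickle
        self.batch_size = 10

    def __getstate__(self):
        # copy the instance state, minus anything SWIG-backed
        state = self.__dict__.copy()
        state.pop('session', None)
        state.pop('pool', None)  # the pool attribute from the original example may need the same treatment
        return state

    def __setstate__(self, state):
        # restore the plain attributes, then recreate the session
        # inside the child process
        self.__dict__.update(state)
        self.session = tf.Session()

With this in place, dill serializes A via its remaining picklable attributes, and each worker should build a fresh tf.Session instead of receiving the parent's SWIG handle.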

tagorepothuneedi commented 3 years ago

> Yeah, the issue is that dill doesn't know how to handle a SwigPyObject. I looked for a dill issue to point this to, but didn't see one after a cursory look -- one may need to be created. Note that if you can modify the class that holds the object to add a __reduce__ method, or build a subclass that derives from object, or something along those lines, the class should serialize.

I'm seeing the same issue and have found no working solution. I was not able to find the high-level wrapper of the SwigPyObject so I could reference it in my code, nor was I able to find a workaround that shares the objects with multiple subprocesses without pickling. Please let me know if anyone has solved this.
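
Another common workaround, sketched below, is to not send the SWIG-backed object across the process boundary at all: keep it out of the class (or function) that gets pickled, and create it lazily, once per worker process, at module level. The helper names (_get_session, run_blob) are hypothetical, the shapes are only illustrative, and this assumes the TF 1.x tf.Session API from the original report.

import numpy as np
import tensorflow as tf
from pathos.multiprocessing import ProcessingPool

_session = None  # one lazily created session per worker process

def _get_session():
    # hypothetical helper: build the tf.Session on first use inside each
    # worker, so the SwigPyObject never has to be pickled
    global _session
    if _session is None:
        _session = tf.Session()
    return _session

def run_blob(blob):
    session = _get_session()
    # ... run session-based work here; only `blob` (a numpy array) crosses
    # the process boundary, and numpy arrays pickle fine
    return blob[:, :, :, 0]

if __name__ == "__main__":
    pool = ProcessingPool(processes=4)
    blobs = [np.ones((1, 180, 320, 6), dtype=np.float32) for _ in range(4)]
    results = pool.map(run_blob, blobs)
    print(len(results))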

getsanjeevdubey commented 3 years ago

@tagorepothuneedi Has anyone found a solution to this?

mmckerns commented 3 years ago

This depends on https://github.com/uqfoundation/dill/issues/348.