datitran / object_detector_app

Real-Time Object Recognition App with Tensorflow and OpenCV
https://medium.com/towards-data-science/building-a-real-time-object-recognition-app-with-tensorflow-and-opencv-b7a2b4ebdc32
MIT License

Error while running multithreading (object_detection_multilayer.py) #11

Closed sameermalik123 closed 7 years ago

sameermalik123 commented 7 years ago

PicklingError                             Traceback (most recent call last)
<ipython-input-...> in <module>()
    153     child_process.daemon = False
    154
--> 155     main_process.start()
    156     child_process.start()
    157

~\Anaconda2\envs\tensorflow\lib\multiprocessing\process.py in start(self)
    103                'daemonic processes are not allowed to have children'
    104         _cleanup()
--> 105         self._popen = self._Popen(self)
    106         self._sentinel = self._popen.sentinel
    107         _children.add(self)

~\Anaconda2\envs\tensorflow\lib\multiprocessing\context.py in _Popen(process_obj)
    210     @staticmethod
    211     def _Popen(process_obj):
--> 212         return _default_context.get_context().Process._Popen(process_obj)
    213
    214 class DefaultContext(BaseContext):

~\Anaconda2\envs\tensorflow\lib\multiprocessing\context.py in _Popen(process_obj)
    311         def _Popen(process_obj):
    312             from .popen_spawn_win32 import Popen
--> 313             return Popen(process_obj)
    314
    315     class SpawnContext(BaseContext):

~\Anaconda2\envs\tensorflow\lib\multiprocessing\popen_spawn_win32.py in __init__(self, process_obj)
     64         try:
     65             reduction.dump(prep_data, to_child)
---> 66             reduction.dump(process_obj, to_child)
     67         finally:
     68             context.set_spawning_popen(None)

~\Anaconda2\envs\tensorflow\lib\multiprocessing\reduction.py in dump(obj, file, protocol)
     57 def dump(obj, file, protocol=None):
     58     '''Replacement for pickle.dump() using ForkingPickler.'''
---> 59     ForkingPickler(file, protocol).dump(obj)
     60
     61 #

PicklingError: Can't pickle <function main_process at 0x...>: it's not the same object as __main__.main_process

I am running this in a virtual environment with all the required dependencies installed; any information would be helpful.
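For reference, here is a minimal sketch (not taken from the app's code) of one common way to get this exact message on Windows: the spawn start method pickles the process target by its name in `__main__`, so rebinding that name (for example, assigning the `Process` object to the same name as the target function) makes `__main__.main_process` point at a different object than the function being pickled.

```python
from multiprocessing import Process


def main_process():
    # Placeholder worker; the real app would run the detection loop here.
    print("running")


# Rebinding the name breaks pickling-by-reference: the Process target is the
# original function, but __main__.main_process now refers to the Process object.
main_process = Process(target=main_process)

if __name__ == "__main__":
    # On Windows this raises:
    # PicklingError: Can't pickle <function main_process ...>:
    # it's not the same object as __main__.main_process
    main_process.start()
    main_process.join()
```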
datitran commented 7 years ago

Hmm, it seems that the main function can't be serialized. I'm not sure why this is the case; maybe it's due to the virtual environment. But usually when we use the Pool class, it uses a queue.Queue to pass tasks to the worker processes, and everything that goes through the queue must also be picklable/serializable.
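As a generic illustration of that point (a sketch, not the repo's code): with `Pool`, both the worker function and its arguments travel through an internal queue to the workers, so they must be picklable. Module-level functions pickle fine; lambdas or functions defined interactively in a notebook's `__main__` generally do not.

```python
from multiprocessing import Pool


def square(x):
    # Module-level function: picklable, so Pool can ship it to worker processes.
    return x * x


if __name__ == "__main__":  # required on Windows, where children are spawned
    with Pool(processes=2) as pool:
        print(pool.map(square, [1, 2, 3]))      # works: [1, 4, 9]
        # pool.map(lambda x: x * x, [1, 2, 3])  # would fail: lambdas can't be pickled
```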

sameermalik123 commented 7 years ago

What could be done regarding this? I read all the documentation on Pool and Process, but the problem still persists. Thanks for your support.

datitran commented 7 years ago

Have you tried to use it on a normal setup instead of a virtual environment yet? I would try to isolate the problem and see where it works and where it doesn't...

sameermalik123 commented 7 years ago

I was using it in a Jupyter notebook. I copied the above code and ran it from the command line as a plain .py file, and it now starts to work, but it barely detects anything. I don't think the problem was due to the virtual environment, since I am still running it in one. But I still can't understand why it doesn't work in an IPython notebook.
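A workaround that usually makes this kind of multiprocessing code run from a notebook on Windows (a sketch with a hypothetical module and function name, not code from this repo): move the worker function into a separate .py file and import it, so the spawned child process can locate it by module name instead of having to reproduce the notebook's `__main__`.

```python
# workers.py -- hypothetical helper module; defining the worker here instead of
# in a notebook cell lets the "spawn" start method pickle it by reference.
def detect_worker(input_q, output_q):
    # Placeholder loop: the real app would run the detection graph per frame.
    for frame in iter(input_q.get, None):  # None acts as the shutdown sentinel
        output_q.put(frame)
```

```python
# Notebook cell (sketch): import the worker so the child process can re-import it.
from multiprocessing import Process, Queue
from workers import detect_worker

input_q, output_q = Queue(maxsize=5), Queue(maxsize=5)
p = Process(target=detect_worker, args=(input_q, output_q))
p.daemon = True
p.start()
```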

datitran commented 7 years ago

Hmm, sorry, I can't help you much here, as I don't know how to replicate the error...