ShahriyarR closed this issue 2 years ago.
@akuporos @nkogteva please have a look.
I am able to suppress this direct issue with `IEPlugin` by applying the following changes:

* I created a supplementary `ie_api_supp.py` file alongside the `ie_api.pyx` file:

```python
from ie_api import IEPlugin


def rebuild_ieplugin(device, plugin_dirs):
    return IEPlugin(device, plugin_dirs)
```
* Then I edited the `IEPlugin` class and added a `__reduce__` method there:

```python
def __reduce__(self):
    device_name = bytes(self.impl.device_name)
    device_name = to_py_string(device_name)
    return (rebuild_ieplugin, (device_name,))
```
But now we have a new error:
```
(Pdb) n
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 234, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.6/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
  File "stringsource", line 2, in openvino.inference_engine.ie_api.ExecutableNetwork.__reduce_cython__
TypeError: self.ie_core_impl,self.impl,self.plugin_impl cannot be converted to a Python object for pickling
```
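For context, the `__reduce__` approach used here follows the standard pickle protocol: instead of trying to serialize the raw C++ handle, pickle stores a module-level rebuild function plus plain Python arguments. A stdlib-only sketch (the `Handle` class and `_rebuild_handle` are made-up stand-ins, not OpenVINO code):

```python
import pickle


def _rebuild_handle(device):
    # Module-level rebuilder: pickle can import it by name on the other side.
    return Handle(device)


class Handle:
    """Stand-in for a Cython extension type wrapping an unpicklable C++ object."""

    def __init__(self, device):
        self.device = device
        # Unpicklable attribute, like the C++ `impl` pointer in ie_api.
        self._impl = lambda: device

    def __reduce__(self):
        # Recreate the object from plain Python data instead of raw pointers.
        return (_rebuild_handle, (self.device,))


clone = pickle.loads(pickle.dumps(Handle("CPU")))
print(clone.device)  # CPU
```

Without `__reduce__`, `pickle.dumps` would fail on the `_impl` attribute, which is exactly what `_ForkingPickler` hits when `multiprocessing` tries to send these objects to a worker.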
Hello @ShahriyarR, which Cython version do you use?
Hi @akuporos
```
# pip3 freeze | grep -i cython
Cython==0.29.20
```
Btw, let me clarify a bit what I did in order to get rid of the first error.
@akuporos
I was able to eliminate the recent error with `ExecutableNetwork` by adding `__reduce__` to the `IECore`, `ExecutableNetwork`, and `InferRequest` classes. The supplementary Python file `ie_api_supp.py` now looks like:
```python
from ie_api import IEPlugin
from ie_api import IECore
from ie_api import ExecutableNetwork
from ie_api import InferRequest


def rebuild_ieplugin(device, plugin_dirs):
    return IEPlugin(device, plugin_dirs)


def rebuild_IECore(xml_config_file):
    return IECore(xml_config_file)


def rebuild_ExecutableNetwork():
    return ExecutableNetwork()


def rebuild_InferRequest():
    return InferRequest()
```
The respective changes in `ie_api.pyx`:
```cython
from ie_api_supp import rebuild_IECore

cdef class IECore:
    ## Class constructor
    #  @param xml_config_file: A full path to `.xml` file containing plugins configuration.
    #  If the parameter is not specified, the default configuration is handled automatically.
    #  @return Instance of IECore class
    def __cinit__(self, xml_config_file: str = ""):
        self.xml_config_file = xml_config_file
        self.impl = C.IECore(xml_config_file.encode())

    def __reduce__(self):
        return (rebuild_IECore, (self.xml_config_file,))
```
```cython
from ie_api_supp import rebuild_ExecutableNetwork

cdef class ExecutableNetwork:
    ## There is no explicit class constructor. To make a valid instance of `ExecutableNetwork`,
    #  use `load()` method of the `IEPlugin` class.
    def __init__(self):
        self._infer_requests = []

    def __reduce__(self):
        return (rebuild_ExecutableNetwork, ())
```
```cython
from ie_api_supp import rebuild_InferRequest

cdef class InferRequest:
    ## There is no explicit class constructor. To make a valid `InferRequest` instance, use `load_network()`
    #  method of the `IECore` class with specified number of requests to get `ExecutableNetwork` instance
    #  which stores infer requests.
    def __init__(self):
        self._inputs_list = []
        self._outputs_list = []
        self._py_callback = lambda *args, **kwargs: None
        self._py_callback_used = False
        self._py_callback_called = threading.Event()
        self._py_data = None

    def __reduce__(self):
        return (rebuild_InferRequest, ())
```
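One caveat worth noting about the patch above: a zero-argument rebuilder like `rebuild_ExecutableNetwork` produces a fresh, empty object, so any state acquired after construction (such as a loaded network) does not survive the pickle round trip. A stdlib-only sketch, with a made-up `Net` class standing in for `ExecutableNetwork`:

```python
import pickle


def _rebuild_net():
    # Zero-argument rebuilder, analogous to rebuild_ExecutableNetwork():
    # the unpickled copy comes back freshly constructed.
    return Net()


class Net:
    """Made-up stand-in for a class like ExecutableNetwork."""

    def __init__(self):
        self.requests = []  # populated later, e.g. after load_network()

    def __reduce__(self):
        return (_rebuild_net, ())


net = Net()
net.requests.append("infer_request_0")  # state acquired after construction
clone = pickle.loads(pickle.dumps(net))
print(clone.requests)  # [] -- the acquired state is lost
```

In the real code this means the unpickled `ExecutableNetwork` has no loaded network behind it, so calls into it in a worker process can crash at the native level.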
So the whole story is about making those objects picklable. But of course, it would be great to have something official. As my solution is not stable or robust enough, I got an extra error:
```
Traceback (most recent call last):
  File "run.py", line 34, in <module>
    main()
  File "run.py", line 30, in main
    multi_cam_start_point(multi_cam_args)
  File "/opt/tor/main/multi_cam_async/multi_camera_face_detection_async.py", line 80, in main
    loop.run_until_complete(async_main(capture_obj, things, face_detect_obj))
  File "/usr/lib/python3.6/asyncio/base_events.py", line 484, in run_until_complete
    return future.result()
  File "/opt/tor/main/multi_cam_async/multi_camera_face_detection_async.py", line 55, in async_main
    await loop.run_in_executor(pool, func_run_fd_draw_all)
concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.
```
The same code works well with `ThreadPoolExecutor`, as expected.
@akuporos any further suggestions on it? Is it worth upgrading my current OpenVINO? Is it possible to run inference in a multiprocess manner in newer versions?
@ShahriyarR Were you able to fix this issue?
@princethewinner Nope, I have dropped the idea of using multiprocessing/`ProcessPoolExecutor`, as it is not supported by OpenVINO out of the box.
Closing, please re-open if assistance is still needed.
Using:
Basically, I am trying to run the Face Detection demo app (the IE) in a multiprocess manner, but got:
This is a common issue with Python pickle + Cython, as the objects must be picklable. I am curious whether this issue is already fixed in the newer 2020.2 or 2020.3 releases.