jgehrcke / gipc

gevent-cooperative child processes and inter-process communication
https://gehrcke.de/gipc
MIT License

OSError with another daemon process #15

Closed jgehrcke closed 7 years ago

jgehrcke commented 9 years ago

Originally reported by: Heungsub Lee (Bitbucket: sublee, GitHub: sublee)


I ran into an OSError when a daemon process was started before starting a gipc process, and I then joined the gipc process:

import multiprocessing
import gipc
def f():
    pass  # placeholder target (not shown in the original report)

d = multiprocessing.Process(target=f)
d.daemon = True
d.start()
p = gipc.start_process(f)
p.join()

The error output looks like this:

Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/atexit.py", line 24, in _run_exitfuncs
    func(*targs, **kargs)
  File "/usr/lib/python2.7/multiprocessing/util.py", line 321, in _exit_function
    p._popen.terminate()
  File "/usr/lib/python2.7/multiprocessing/forking.py", line 172, in terminate
    os.kill(self.pid, signal.SIGTERM)
OSError: [Errno 3] No such process
Error in sys.exitfunc:
Traceback (most recent call last):
  File "/usr/lib/python2.7/atexit.py", line 24, in _run_exitfuncs
    func(*targs, **kargs)
  File "/usr/lib/python2.7/multiprocessing/util.py", line 321, in _exit_function
    p._popen.terminate()
  File "/usr/lib/python2.7/multiprocessing/forking.py", line 172, in terminate
    os.kill(self.pid, signal.SIGTERM)
OSError: [Errno 3] No such process

jgehrcke commented 9 years ago

Original comment by Heungsub Lee (Bitbucket: sublee, GitHub: sublee):


Oh, sorry. I had forgotten to write a comment. As you advised, simply removing the multiprocessing callbacks fixed the issue. Thank you.
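
The thread does not show the exact change, but the callback that produces the traceback above is multiprocessing.util's _exit_function, registered via atexit. A minimal sketch of deregistering it on Python 2.7 (where atexit keeps its handlers in the private _exithandlers list), assuming this is indeed the callback being referred to:

import atexit
import multiprocessing.util

# Python 2.7 stores atexit callbacks as (func, args, kwargs) tuples in the
# private list atexit._exithandlers. Filter out multiprocessing's exit
# handler so it does not try to terminate already-reaped daemonic children
# at interpreter shutdown.
atexit._exithandlers[:] = [
    (func, args, kwargs)
    for (func, args, kwargs) in atexit._exithandlers
    if func is not multiprocessing.util._exit_function
]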

jgehrcke commented 9 years ago

Original comment by Jan-Philip Gehrcke (Bitbucket: jgehrcke, GitHub: jgehrcke):


Heungsub, did my last comment help you resolve the issue you were observing?

jgehrcke commented 9 years ago

Original comment by Jan-Philip Gehrcke (Bitbucket: jgehrcke, GitHub: jgehrcke):


Hey, thanks for the fast feedback. Before we go into the hassle of finding a workaround for cov_core and gipc in your situation, I want to clarify what I tried to say before.

I still assume that you are using gevent, because otherwise you would not need gipc in the first place. Am I right that your application depends on gevent? If yes: let us forget about gipc for a second. If your testing setup involves gevent and multiprocessing, and especially the forking part of multiprocessing, I can almost guarantee that you will at some point be bitten by issues stemming from the pure multiprocessing-gevent combination.

I can only recommend to not combine multiprocessing and gevent directly. gipc exists to help you with that.

gipc will never officially support being used in the same application together with vanilla multiprocessing, especially because multiprocessing may call os.waitpid() in various places, which usually races with libev's/gipc's SIGCHLD-signal-watcher-based child process monitoring.
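
To make that race concrete, here is a minimal standalone sketch (not from the issue) of its end state: once one side has reaped a child, the PID is gone, and a later attempt by the other side to signal it fails with exactly the "[Errno 3] No such process" seen in the traceback above. The waitpid() call stands in for libev's child watcher, the kill() call for multiprocessing's atexit cleanup:

import errno
import os
import signal

pid = os.fork()
if pid == 0:
    os._exit(0)           # child: exit immediately

os.waitpid(pid, 0)        # first reaper collects the child (stand-in for libev)

try:
    # second reaper tries to terminate the already-reaped child
    # (stand-in for multiprocessing's _exit_function)
    os.kill(pid, signal.SIGTERM)
except OSError as exc:
    assert exc.errno == errno.ESRCH   # [Errno 3] No such process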

Having said this, I think what you want should be achievable. I myself use pytest-cov and therefore cov-core in order to test the coverage of the gipc unit tests. The usage of multiprocessing in cov-core is limited to installing a fork handler (there are no processes spawned). So, whether there can be a workaround for your situation and how complex it might be entirely depends on the specifics of your use case. I think it could be doable. You might want to show a minimal working application/setup that clarifies your use case.

An important insight, I guess, is that the entire functionality in https://github.com/schlamar/cov-core/blob/master/cov_core.py in lines 10 to 29 is optional.
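
For context: as noted above, that optional block only installs a fork handler. An illustrative sketch of the multiprocessing.util.register_after_fork() pattern it relies on; the function name below is made up, and this is not cov_core's actual code:

import multiprocessing.util

def _start_coverage_in_child(arg):
    # illustrative hook body: (re)start a coverage collector in the child
    pass

# register_after_fork(obj, func) makes multiprocessing call func(obj) in
# every child it creates via fork; no processes are spawned by this call.
multiprocessing.util.register_after_fork(_start_coverage_in_child,
                                         _start_coverage_in_child)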

jgehrcke commented 9 years ago

Original comment by Heungsub Lee (Bitbucket: sublee, GitHub: sublee):


I don't import multiprocessing directly in my project, but cov_core, a third-party library, uses multiprocessing. Is there a workaround for using cov_core together with gipc?

jgehrcke commented 9 years ago

Original comment by Jan-Philip Gehrcke (Bitbucket: jgehrcke, GitHub: jgehrcke):


Sorry for taking so long to get back to you. In your example you use both multiprocessing and gipc, doh. I thought it might be obvious that you should not mix multiprocessing with gipc. Especially, you should not blindly use multiprocessing with gevent, and I assume that you use gevent because you are using gipc. Replace your initial multiprocessing.Process() call with the corresponding gipc-based call, as in this example:

import gipc
import gevent

def f():
    gevent.sleep(5)

p1 = gipc.start_process(target=f, daemon=True)
p2 = gipc.start_process(target=f)
p1.join()
p2.join()

This executes just fine. Does this answer solve your problem?