
asyncio historical repository
https://docs.python.org/3/library/asyncio.html

How can I have multiple asyncio processes listening on the same port? #481

Closed: r0fls closed this issue 7 years ago

r0fls commented 7 years ago

This is related to an issue we're having with sanic: https://github.com/channelcat/sanic/issues/203

Specifically, this code isn't working to spawn multiple HTTP workers: https://github.com/channelcat/sanic/blob/master/sanic/sanic.py#L354

This happens even though we start the servers with reuse_port set to True: https://github.com/channelcat/sanic/blob/master/sanic/server.py#L274

It will start multiple processes, but only one of them will respond to web requests. I've tried various approaches and can't get anything to work. Is there an example of doing this, or any information that would be helpful?

r0fls commented 7 years ago

I am aware that the correct way to approach this situation is WSGI, which is where we're headed. However, I'm curious whether it's possible to do this directly.

gvanrossum commented 7 years ago

Note: I've never heard of Sanic and haven't looked at its code. But maybe this comment from the referenced thread is relevant?

I wonder if the issue is that different processes can't share the same uvloop.

How does Sanic create worker processes? If it just forks and then runs the same code in each, especially if the loop has already been started, that is indeed a problem. Is there a way to create the workers first and then create a loop in each one?
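A minimal sketch of that ordering, assuming a platform where SO_REUSEPORT is available (e.g. Linux 3.9+): the workers are forked first, and each one builds its own event loop and its own listening socket. The Echo protocol, address, and worker count are placeholders for illustration, not Sanic's actual code.

```python
import asyncio
import multiprocessing

# Hypothetical address and worker count, just for the sketch.
HOST, PORT, WORKERS = "127.0.0.1", 8000, 4


class Echo(asyncio.Protocol):
    """Minimal protocol so the sketch is self-contained."""

    def connection_made(self, transport):
        self.transport = transport

    def data_received(self, data):
        self.transport.write(data)


def worker():
    # The event loop is created *inside* the child, after the fork,
    # so no loop state is ever shared between workers.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    server = loop.run_until_complete(
        loop.create_server(Echo, HOST, PORT, reuse_port=True)
    )
    try:
        loop.run_forever()
    finally:
        server.close()
        loop.close()


if __name__ == "__main__":
    # The parent only forks the workers; it never touches asyncio itself.
    procs = [multiprocessing.Process(target=worker) for _ in range(WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```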

1st1 commented 7 years ago

I am aware that the correct way to approach this situation is WSGI, which is where we're headed. However, I'm curious whether it's possible to do this directly.

I don't think anything async should ever use the WSGI stack. So it's something you can try, but it's definitely not "the correct" way.

I know that Sanic uses uvloop by default, so my question is: have you tried both asyncio and asyncio+uvloop with reuse_port=True?

Also, there are other ways to share a socket between multiple processes. One way is to manually create a socket in the main process, bind it, set it to be inheritable, and then fork (either manually, or using multiprocessing). Then, in the subprocesses, you can start an asyncio event loop and call loop.create_server(sock=inherited_sock).
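A rough sketch of that second approach, assuming multiprocessing with the default fork start method on a POSIX system so the bound socket is available in the children; the handler, address, and worker count are invented for illustration.

```python
import asyncio
import multiprocessing
import socket

# Hypothetical address and worker count, just for the sketch.
HOST, PORT, WORKERS = "127.0.0.1", 8000, 4


async def handle(reader, writer):
    # Echo one read back to the client, then close the connection.
    data = await reader.read(1024)
    writer.write(data)
    await writer.drain()
    writer.close()


def worker(sock):
    # The loop is created after the fork; the listening socket was
    # created and bound once in the parent and is only inherited here.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    server = loop.run_until_complete(asyncio.start_server(handle, sock=sock))
    try:
        loop.run_forever()
    finally:
        server.close()
        loop.close()


if __name__ == "__main__":
    # Create and bind the socket once in the main process.
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind((HOST, PORT))
    sock.set_inheritable(True)  # mark the fd inheritable, as suggested above

    procs = [multiprocessing.Process(target=worker, args=(sock,))
             for _ in range(WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

In this sketch the parent only binds the socket; the listen() call happens inside create_server when each worker starts serving.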

r0fls commented 7 years ago

If it just forks and then runs the same code in each, especially if the loop has already been started, that is indeed a problem. Is there a way to create the workers first and then create a loop in each one?

This does look like what is happening...

have you tried both asyncio and asyncio+uvloop with reuse_port=True?

Yes, I've tried removing uvloop and using only asyncio, which is why I opened the issue here.

r0fls commented 7 years ago

I got this working with the method suggested by @1st1. Thank you both.