davidbrochart opened 1 year ago
I'm not sure I understand how this package can help.
Also I don't see why the loop needs to be re-entrant, given that it is possible and legal to start additional loops on other threads. Below I took one of your examples posted elsewhere and modified it to do this:
```python
import asyncio
from threading import Thread

def reentrant_asyncio_run(coro):
    ret = None

    def _run(coro):
        nonlocal ret  # propagate the result out of the thread
        ret = asyncio.run(coro)

    t = Thread(target=_run, args=(coro,))
    t.start()
    t.join()
    return ret

async def task(name):
    for i in range(10):
        await asyncio.sleep(0.1)
        print(f"from {name}: {i}")

async def bar():
    asyncio.create_task(task("bar"))
    await asyncio.sleep(1.1)
    print("bar done")

def foo():
    # asyncio.run inside an already running event loop
    # pre-empts the execution of any other task in the event loop
    reentrant_asyncio_run(bar())

async def main():
    t = asyncio.create_task(task("main"))  # not executed until foo() is done
    foo()
    await asyncio.sleep(1.1)  # t resumes execution

asyncio.run(main())
```
> I'm not sure I understand how this package can help.
Maybe it cannot, but it allows running async code from a sync function, so I thought it could detect that it's already running in an event loop, and in that case do some greenlet magic to await the async function in the sync function. But I'm probably missing something.
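The detection part, at least, is straightforward. A minimal sketch of what such a package would have to do first (the helper name `run_maybe_nested` is made up for illustration, not part of greenletio):

```python
import asyncio

def run_maybe_nested(coro):
    # Hypothetical helper: decide how to run a coroutine from sync code.
    # If no loop is running, a plain asyncio.run() is enough; if one is,
    # this is exactly where the greenlet magic would have to kick in,
    # because a plain asyncio.run() would raise here.
    try:
        asyncio.get_running_loop()
    except RuntimeError:
        return asyncio.run(coro)  # no loop running: the easy case
    raise RuntimeError("nested call: would need a greenlet-based await here")

print(run_maybe_nested(asyncio.sleep(0, result="ok")))  # prints "ok"
```

The hard part is entirely in the branch that raises: suspending the sync caller without blocking the already-running loop.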
> Also I don't see why the loop needs to be re-entrant, given that it is possible and legal to start additional loops on other threads.
Sure, and that's our current solution in Jupyter. But threads are not as lightweight as coroutines, and some libraries don't like it when they don't run in the main thread. Also, for asyncio objects like `Event` to be used in both the async and the sync code, they must belong to the same event loop.
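To illustrate the `Event` point with plain threads (a sketch of the current thread-based workaround, nothing to do with greenletio itself): an `Event` is tied to the loop that waits on it, so sync code running in another thread cannot call `event.set()` directly; the call has to be routed back to the owning loop.

```python
import asyncio
from threading import Thread

async def main():
    event = asyncio.Event()
    loop = asyncio.get_running_loop()

    def sync_code():
        # Runs in a different thread. Calling event.set() here directly
        # would not be thread-safe; hand the call back to the owning loop.
        loop.call_soon_threadsafe(event.set)

    Thread(target=sync_code).start()
    await asyncio.wait_for(event.wait(), timeout=1)
    return "event received"

print(asyncio.run(main()))  # prints "event received"
```

This indirection is exactly the kind of friction that goes away if everything shares one loop.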
> But I'm probably missing something.
Yes, I think what you are missing is that in spite of the greenlet stuff happening in the background, the asyncio loop runs without any hacks or modifications. A sync function that is relocated to a greenlet via the `async_()` function or decorator effectively becomes an async function that can interact with native async functions directly. Running async code from a sync function is definitely the goal of this package, but I don't see how tasks that were started before the sync function can be prevented from running until the sync function returns.
> But threads are not as lightweight as coroutines
How many levels of loops inside loops do you need? For a handful of them, using threads should not have any performance or resource consumption impact.
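As a rough illustration of the cost involved (numbers are machine-dependent, and `nested_runs_via_threads` is a made-up name): each nesting level in the thread-based approach adds one thread start/join pair, which is cheap at small depths.

```python
import time
from threading import Thread

def nested_runs_via_threads(depth):
    # Toy measurement: one thread start/join per nesting level, as the
    # thread-based reentrant_asyncio_run approach would incur.
    start = time.perf_counter()
    for _ in range(depth):
        t = Thread(target=lambda: None)
        t.start()
        t.join()
    return time.perf_counter() - start

print(f"5 levels: {nested_runs_via_threads(5) * 1000:.2f} ms")
```

For a handful of levels this overhead is negligible next to the work the loops actually do.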
> Also, for asyncio objects like `Event` to be used in the async and sync code, they must belong to the same event loop.
So sharing concurrency primitives is a requirement? What's the use case for it? And how is this going to work, given that only the innermost loop is active and all the others are blocked, so they cannot trigger or receive notifications? Feels like a recipe for deadlocks.
Threads might be just fine, but in theory they can still be problematic with some libraries, or even on some platforms like the browser running Python in WASM, where they don't exist. I'm sorry I can't provide any real use case, but these are just some potential issues that make me look for a better solution.
Same for sharing concurrency primitives, which used to be possible when we were using nest-asyncio (but I admit it was more "magical", since `asyncio.run` was not even blocking).
I think you will need to convince the Python core team to implement this in Python, because doing this without threads is just not possible right now.
FWIW this is possible with the greenback package:
```python
import asyncio

import greenback

async def async_function():
    await asyncio.sleep(1)

def sync_function():
    # await an async function from sync code, through the portal
    greenback.await_(async_function())

async def main():
    await greenback.ensure_portal()  # set up the portal for this task
    sync_function()

asyncio.run(main())
```
I am wondering if greenletio could be used to implement a nested event loop. It seems it's currently not possible, but do you see a fundamental reason why it could not work?