It would be great to have support for async scheduling of tasks.
Obviously, support for Python asyncio is already on the list, and full support will need a lot of work. Perhaps a roadmap or TODO list with items that people can work on would help?
Worker - not essential?
The worker already runs tasks in parallel, starting an event loop is trivially possible with a custom Task class, and often enough the work done in a task does not lend itself to easy asyncio-ization. So personally, I think this is less critical. If desired, I can provide a PR to allow that.
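To make the custom-Task-class idea concrete, here is a rough sketch of what I mean, assuming the coroutine is simply driven to completion with asyncio.run() inside the otherwise synchronous worker. AsyncioTask and fetch_url are names made up for illustration, not existing Celery API:

```python
# Rough sketch only: drive a coroutine to completion inside the ordinary
# (synchronous) worker by overriding the base Task class.
# AsyncioTask and fetch_url are illustrative names, not Celery API.
import asyncio

from celery import Celery, Task

app = Celery("example", broker="amqp://localhost")


class AsyncioTask(Task):
    """Run coroutine-returning task bodies on a private event loop."""

    def __call__(self, *args, **kwargs):
        result = super().__call__(*args, **kwargs)
        if asyncio.iscoroutine(result):
            # A fresh loop per call: simple, but wasteful for hot tasks.
            return asyncio.run(result)
        return result


@app.task(base=AsyncioTask)
async def fetch_url(url):
    # Any asyncio client library could be used here instead of sleep().
    await asyncio.sleep(0.1)
    return url
```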
Client - start with this?
Blocking calls on the client side, however, are more cumbersome. When scheduling tasks from, e.g., an aiohttp web server, calling some_task.apply_async() blocks, and naive calls to get() block as well. I am not sure whether using Executors helps; for now I have carefully arranged things so that I never wait for the result of a task on the client side.
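For reference, the executor-style workaround I am unsure about would look roughly like this: the blocking apply_async()/get() calls are pushed onto a thread pool from an aiohttp handler. The my_task import and the route are purely illustrative:

```python
# Illustrative workaround available today: keep the blocking Celery calls,
# but push them onto a thread pool so the aiohttp handler itself stays async.
import asyncio
from concurrent.futures import ThreadPoolExecutor

from aiohttp import web

from tasks import my_task  # hypothetical Celery task module

executor = ThreadPoolExecutor(max_workers=4)


async def handler(request):
    loop = asyncio.get_running_loop()
    # apply_async() blocks while talking to the broker; run it in a thread.
    result = await loop.run_in_executor(executor, my_task.apply_async, (42,))
    # get() blocks until the task finishes; also run it in a thread.
    value = await loop.run_in_executor(
        executor, lambda: result.get(timeout=10)
    )
    return web.json_response({"value": value})


app = web.Application()
app.add_routes([web.get("/run", handler)])

if __name__ == "__main__":
    web.run_app(app)
```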
Design
I don't understand the internals of Celery well enough for this. Perhaps something like the following:
1. Implement an Executor-based wrapper for the backends.
2. Add coroutines to the interface that use the executor for async task scheduling and result checking.
3. Add internal loop management for sync calls.
4. Rewrite the backends one by one to become asyncio-capable.
Code churn is likely preferable to a big rewrite that will never finish. So adding an asyncio API that simply uses threads/processes internally, and iteratively working out the internal details, sounds like an approach that might also allow sharing the work with more "stakeholders"?
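To illustrate steps 1 and 2 above, a purely additive layer could look roughly like this: coroutine front-ends that delegate the existing blocking calls to a ThreadPoolExecutor, so no backend has to be rewritten up front. AwaitableSignature, AwaitableResult and the awaitable_* method names are invented for illustration and are not Celery API:

```python
# Hypothetical additive layer: coroutine front-ends that delegate the
# existing blocking signature/result calls to a thread pool, so no
# backend has to be rewritten up front.
import asyncio
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=8)


async def _run_blocking(func, *args, **kwargs):
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(_executor, lambda: func(*args, **kwargs))


class AwaitableResult:
    """Wraps a regular AsyncResult with a coroutine get()."""

    def __init__(self, async_result):
        self._result = async_result

    async def awaitable_get(self, timeout=None):
        return await _run_blocking(self._result.get, timeout=timeout)


class AwaitableSignature:
    """Wraps a regular task signature with a coroutine apply_async()."""

    def __init__(self, signature):
        self._signature = signature

    async def awaitable_apply_async(self, **options):
        async_result = await _run_blocking(self._signature.apply_async, **options)
        return AwaitableResult(async_result)
```

With something along these lines, the UI/UX proposed below falls out as res = await AwaitableSignature(some_task.s(arg)).awaitable_apply_async(), and individual backends could later swap the thread-pool delegation for native asyncio I/O without changing the coroutine surface.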
Architectural Considerations
Lots...
Proposed Behavior
It would be a great first step if it were possible to await task results from an asyncio loop. At some future point, it would be great if Celery were async-native, not only in the coding style the tasks demand but also in the way it works internally.
Proposed UI/UX
```python
task = await some_task.s(arg).awaitable_async_apply()
res = await task.awaitable_get()

res = await some_task.s(arg).awaitable_apply()
```
It's a bit awkward that apply_async() is already taken; I'm having a hard time coming up with good names, since the usual approach for a sync/async alternate API is to prefix with async_.
Checklist
Related Issues and Possible Duplicates
Related Issues
Possible Duplicates
#3883, ...
Related Projects:
Diagrams
N/A
Alternatives
None