RealOrangeOne / django-tasks

A reference implementation and backport of background workers and tasks in Django
https://pypi.org/project/django-tasks/
BSD 3-Clause "New" or "Revised" License

Add initial support for a Celery backend #64

Open matiasb opened 3 months ago

matiasb commented 3 months ago

This is an initial PoC of a Celery backend implementation (still in progress, and probably requiring some extra work besides tests).

How to use it:
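
The PR's own usage snippet isn't reproduced in this thread, but a plausible setup looks like the sketch below. The backend's dotted path and the broker details are assumptions for illustration; the `TASKS` setting and the `@task()`/`enqueue()` calls are the documented django-tasks API.

```python
# settings.py -- illustrative only; the Celery backend's dotted path is an
# assumption, not something confirmed in this thread.
TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.celery.CeleryBackend",
    }
}

# tasks.py -- define and enqueue a task using the documented django-tasks API.
from django_tasks import task


@task()
def send_welcome_email(user_id: int) -> None:
    # Placeholder body for the sake of the example.
    print(f"Pretend we emailed user {user_id}")


# Elsewhere in application code:
# send_welcome_email.enqueue(42)
```

With something like this in place, and a Celery worker running against the broker, you should see the behaviour described next.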

Your tasks should now be queued in RabbitMQ and picked up and run by the Celery worker. (FWIW, I have been using this in a simple personal project, and things seem to work for me so far.)

A few items to discuss:

RealOrangeOne commented 3 months ago

You are an absolute legend!

> Should it be possible to run the worker via a management command

I think deferring to however Celery runs its tasks, even without django-tasks, is the way to go. #7 can deal with that in the future.

> Should it be possible to configure a subset of the Celery config through django-tasks?

I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point django-tasks at to find its settings?

As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?

> just wrapping the minimal bits to queue tasks through Celery

I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, that's enough to start with; surfacing extras can come with time. Especially true if said configuration comes from the app.

matiasb commented 3 months ago

> Should it be possible to run the worker via a management command
>
> I think deferring to however Celery runs its tasks, even without django-tasks, is the way to go. #7 can deal with that in the future.

Sounds good :+1:

> Should it be possible to configure a subset of the Celery config through django-tasks?
>
> I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point django-tasks at to find its settings?

You mean defining a way to configure Celery via the TASKS setting, and making the Celery app read its config from there?
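
Nothing in this thread pins down what that would look like, but one possible shape, purely as a sketch (the `OPTIONS` keys and the backend path are invented here, not part of this PR), could be:

```python
# settings.py -- hypothetical sketch of driving Celery config through the
# django-tasks TASKS setting; the OPTIONS keys are invented for illustration.
TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.celery.CeleryBackend",  # assumed path
        "OPTIONS": {
            # Values the backend could copy onto the Celery app's conf,
            # e.g. app.conf.broker_url = OPTIONS["broker_url"].
            "broker_url": "amqp://guest:guest@localhost:5672//",
            "result_backend": "rpc://",
        },
    }
}
```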

> As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?

You need to define a Celery app, which keeps the task registry and is the base for starting the worker. I'm adding a minimal/default app here that sets the default queue name (as defined by django-tasks) and tries to pick up CELERY_* config settings from the DJANGO_SETTINGS_MODULE. You can still define your own app (as documented) and start the worker(s) from there instead (that custom app will also be preferred when registering tasks). If no app is given, Celery will set up a 'default' one, but then it gets tricky to make the worker find the registered tasks.
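
As a rough illustration of what such a minimal app could look like (this is a sketch built from the standard Celery-with-Django pattern, not the actual code in this PR; the app name and queue name are assumptions):

```python
# A minimal default Celery app in the spirit described above -- a sketch only.
from celery import Celery

# The app name here is an assumption; it just labels the app.
app = Celery("django_tasks")

# Pick up CELERY_*-prefixed settings from the active DJANGO_SETTINGS_MODULE.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Match the queue name django-tasks uses by default ("default" is assumed).
app.conf.task_default_queue = "default"

# Let the worker find task modules in the installed Django apps.
app.autodiscover_tasks()
```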

> just wrapping the minimal bits to queue tasks through Celery
>
> I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, that's enough to start with; surfacing extras can come with time. Especially true if said configuration comes from the app.

Makes sense!

RealOrangeOne commented 3 months ago

My main thinking is around people wanting to adopt django-tasks gradually, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need both that and custom apps for this to be easy to adopt.

matiasb commented 3 months ago

> My main thinking is around people wanting to adopt django-tasks gradually, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

People will be able to use their own. If they are already using Celery with Django, they should have an app already, and they can keep using it (in that case there will already be a default_app set, and the worker should be run from that app too, which is likely what they were already doing). In any case, I'm not sure how common migrating from Celery to django-tasks would be, since that could require multiple changes depending on how Celery is used: re-decorating tasks, changing how they are queued, dealing with class-based tasks, etc. I think django-tasks is great for getting started with background workers, and for making it smooth and easy to switch backends depending on your needs.
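
To make the kind of changes alluded to above concrete, here is a small before/after sketch (the task names are invented; the decorator and queuing calls on each side are the standard documented Celery and django-tasks APIs):

```python
# Sketch of the changes a Celery -> django-tasks migration involves:
# the task is re-decorated and queued through a different API.
from celery import shared_task
from django_tasks import task


# A typical Celery task, queued with resize_image.delay(42).
@shared_task
def resize_image(image_id: int) -> None:
    ...


# The django-tasks equivalent, queued with resize_image_dt.enqueue(42).
@task()
def resize_image_dt(image_id: int) -> None:
    ...
```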

> It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need both that and custom apps for this to be easy to adopt.

Yeah. Just in case: as things stand you can already use a custom app by following the Celery docs. If you define your own app and make sure it gets loaded when your Django project starts, it will be set as the default app (and then you need to start the worker from that app instead, passing the right path to the celery -A <your-app> worker command).
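
For reference, that is the standard pattern from Celery's "First steps with Django" docs; it looks roughly like this (`myproject` is a placeholder for your actual project package):

```python
# myproject/celery.py -- your own app, following the Celery Django docs.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```

```python
# myproject/__init__.py -- ensure the app is loaded when Django starts, so it
# becomes the default app (the one django-tasks' backend would then prefer).
from .celery import app as celery_app

__all__ = ("celery_app",)
```

The worker is then started against that app: `celery -A myproject worker`.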

I will iterate on this a few more times over the coming days (I will fix all the lint issues, for example :-), check for any other possible improvements, and add some tests). Let me know if you have any other thoughts or feedback!

auvipy commented 2 months ago

I would like to follow this development....