Open matiasb opened 3 months ago
You are an absolute legend!
> Should it be possible to run the worker via a management command?

I think deferring to however Celery runs its tasks, even without django-tasks, is the way to go. #7 can deal with that in future.

> Should it be possible to config a subset of the Celery config through django-tasks?

I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point django-tasks at to find its settings?

As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?

> just wrapping the minimal bits to queue tasks through Celery

I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, then I think that's absolutely fine! Surfacing extras can come with time. Especially true if said configuration comes from the app.
> Should it be possible to run the worker via a management command?
>
> I think deferring to however Celery runs its tasks, even without django-tasks, is the way to go. #7 can deal with that in future.

Sounds good :+1:
> Should it be possible to config a subset of the Celery config through django-tasks?
>
> I think it'd be ideal if we can. By the looks of it, you've got a custom Celery app with this, which we could point django-tasks at to find its settings?

You mean defining a way to configure Celery via the `TASKS` setting and make the Celery app get the config from there?
> As an aside, how viable is making this work without needing a custom app? Perhaps by pointing django-tasks to an existing app (optionally)?

You need to define a Celery app, which will keep the tasks' registry and be the base to start the worker. I'm adding a minimal/default app here which sets the default queue name (as defined by django-tasks) and will try to get `CELERY_*` config settings from the `DJANGO_SETTINGS_MODULE`. You can still define your own app (as documented) and start the worker(s) from there instead (that custom app will be preferred when registering tasks too). If no app is given, Celery will set up a 'default' one, but then it gets tricky to make the worker find the registered tasks.
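The `CELERY_*` lookup described here boils down to filtering Django settings by prefix and lowercasing the remainder, much like Celery's own `config_from_object(..., namespace="CELERY")` behaves. A minimal standalone sketch of that filtering step (the helper name is made up for illustration, not from this PR):

```python
def celery_config_from_settings(settings: dict, namespace: str = "CELERY_") -> dict:
    """Extract namespace-prefixed settings as lowercase Celery option names."""
    return {
        key[len(namespace):].lower(): value
        for key, value in settings.items()
        if key.startswith(namespace)
    }

# Only CELERY_*-prefixed settings are picked up; everything else is ignored.
config = celery_config_from_settings(
    {"CELERY_BROKER_URL": "amqp://localhost", "DEBUG": True}
)
# → {"broker_url": "amqp://localhost"}
```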
> just wrapping the minimal bits to queue tasks through Celery
>
> I'd say that's absolutely fine! If it supports all the features django-tasks surfaces, then I think that's absolutely fine! Surfacing extras can come with time. Especially true if said configuration comes from the app.

Makes sense!
My main thinking is around people wanting to slowly use django-tasks, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need both that and custom apps for this to be easy to adopt.
> My main thinking is around people wanting to slowly use django-tasks, without needing to rewrite all their Celery integration. If django-tasks defines the app, people can't use their own. Is there a way to achieve that? Even if the default is internal, we could let people specify the module path to their own.

People will be able to use their own. If they are already using Celery with Django, they should have one, and they can keep using it (in that case there will already be a default_app set, and the worker should be run from that app too, which is likely what they were already doing).

In any case, I'm not sure how common migrating from Celery to django-tasks would be, since that could require multiple changes depending on the usage of Celery (redecorating tasks, the way they are queued, class-based tasks, etc.). I think django-tasks is great for getting started with background workers, making it smooth and easy to switch backends depending on your needs.
> It'd be great to be able to configure Celery directly through django-tasks (the built-in app), but I think we'd need both that and custom apps for this to be easy to adopt.

Yeah. Just in case: you can already use a custom app as things stand, following the Celery docs. If you define your app and make sure it gets loaded when your Django project starts, it will be set as the default app (and then you need to start the worker from that app instead, passing the right path to the `celery -A <your-app> worker` command).
I will iterate a few more times on this in the upcoming days (I will fix all lint issues, for example :-)), check any other possible improvements, and add some tests. Let me know if you have any other thoughts or feedback!
I would like to follow this development....
This is an initial PoC for a Celery backend implementation (still in progress, probably requiring some extra work, besides tests).
How to use it:
Using this `django_tasks` branch in your Django project environment, make sure to also install Celery:

```shell
$ pip install celery
```
Update your `settings.py` to set the Celery backend. You can also set extra Celery config in `settings.py` by defining `CELERY_*`-prefixed settings (e.g. to define `broker_url`, you should add a setting for `CELERY_BROKER_URL`); otherwise it will just use the default values, which should be OK.

If you don't set a broker URL, the expected one is a local RabbitMQ. You can run it using Docker like this:
```shell
$ docker run -d -p 5672:5672 rabbitmq
```
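As a concrete sketch, the `settings.py` change described above might look like this; the backend's dotted path is my assumption based on the module path used in the worker command, not something this PR confirms, so check the branch's docs for the real one:

```python
# settings.py -- sketch; the BACKEND dotted path is an assumption
TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.celery.CeleryBackend",
    },
}

# Optional extra Celery config, picked up via the CELERY_ prefix:
CELERY_BROKER_URL = "amqp://localhost:5672"
```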
You shouldn't need to change any `django_tasks`-related code in your project.

Finally, to run the Celery worker:

```shell
$ DJANGO_SETTINGS_MODULE=<your_project.settings> celery -A django_tasks.backends.celery.app worker -l INFO
```

(This uses a simple default Celery app (see `app.py` below) pulling config from Django settings; it can be customized per project if needed.)

Your tasks should now be queued into RabbitMQ and picked up and run by the Celery worker. (FWIW, I have been using this in a simple personal project, and things seem to work for me so far.)
A few items to discuss: