FeralRobot opened this issue 3 years ago
An update here: there is still a problem in my example when starting up Django on an empty database, where I'm getting the following missing-table error:
celery-beat | psycopg2.errors.UndefinedTable: relation "customers_client" does not exist
celery-beat | LINE 1: ...."on_trial", "customers_client"."created_on" FROM "customers...
I must have originally started up without the generate_beat_schedule call in celery.py, which allowed the customers_client table to be created in the database; I then made the change to add generate_beat_schedule and rebuilt the Docker setup while preserving the database container.
I went back and confirmed the sequence theory above. Starting from scratch in Docker, if I comment out
app.conf.beat_schedule = generate_beat_schedule(
    {
        "celery.backend_cleanup": {
            "task": "celery.backend_cleanup",
            "schedule": crontab("0", "4", "*"),
            "options": {"expire_seconds": 12 * 3600},
            "tenancy_options": {
                "public": True,
                "all_tenants": True,
                "use_tenant_timezone": True,
            }
        }
    }
)
then Docker builds and comes up. Then do docker-compose down, remove the comments around the generate_beat_schedule block, docker-compose build, and docker-compose up; since the database data is preserved, everything comes up.
Of course, this is a super ugly workaround.
Any suggestions about what to change above to have one celery.py file that works from an empty database and also works on an established database?
This issue was due to my setup, which had the following in the app's __init__.py:
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
I deleted the contents of that file and added the absolute import below to celery.py. With this modification, it starts up successfully on an empty database.
from __future__ import absolute_import
import os
from celery.schedules import crontab
from tenant_schemas_celery.app import CeleryApp as TenantAwareCeleryApp
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'kbdj.settings')
# must follow the settings configuration, as generate_beat_schedule references settings
from django_tenants_celery_beat.utils import generate_beat_schedule
Nice package. This issue adds some more information to your celery.backend_cleanup example. There were some additional steps needed to get the example to work. Consider these as possible documentation additions.
First - where does the example code go? It goes in polls/polls/celery.py.
Second - you have to import crontab from somewhere. Eventually you will find it should be imported from celery.schedules:
from celery.schedules import crontab
Third - to use generate_beat_schedule you have to import it from django_tenants_celery_beat.utils. However, this import reads the project's settings, so you need to set the Django settings module before you import generate_beat_schedule. Something like this:
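(A minimal sketch of the ordering; the settings module name kbdj.settings is from my project, so substitute your own.)
import os

# Set the default Django settings module before any import that touches settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'kbdj.settings')

# This import reads Django settings, so it must come after the line above
from django_tenants_celery_beat.utils import generate_beat_schedule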
Finally, perhaps obvious to others (but not to me): the beat_schedule configuration needs to come after autodiscover_tasks().
Full example:
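(A minimal sketch assembling the snippets above; the app creation, config_from_object, and autodiscover_tasks lines are standard Celery-with-Django boilerplate rather than anything specific to this package, and the kbdj names are from my project.)
from __future__ import absolute_import

import os

from celery.schedules import crontab
from tenant_schemas_celery.app import CeleryApp as TenantAwareCeleryApp

# set the default Django settings module for the 'celery' program
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'kbdj.settings')

# must follow the settings configuration, as generate_beat_schedule references settings
from django_tenants_celery_beat.utils import generate_beat_schedule

app = TenantAwareCeleryApp()
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# the beat_schedule config must come after autodiscover_tasks()
app.conf.beat_schedule = generate_beat_schedule(
    {
        "celery.backend_cleanup": {
            "task": "celery.backend_cleanup",
            "schedule": crontab("0", "4", "*"),
            "options": {"expire_seconds": 12 * 3600},
            "tenancy_options": {
                "public": True,
                "all_tenants": True,
                "use_tenant_timezone": True,
            }
        }
    }
)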