Cally99 opened 4 years ago
As per the documentation, the default path location does not work. I created the file myself and copied the relative path into 'location': 'backend/project/celery_once'.
Tasks are firing now :)
Please check whether the documentation needs to be updated, or whether the default tmp file is not being created. By the way, I'm using Docker.
celery-once should create all the necessary parent directories and the lock file itself automatically. It may fail to do so because of a permission problem or something else, but I don't think that is what is happening in your case. The "No such file or directory" error in your log is thrown when celery-once tries to remove the lock file after the task completes. It looks as if the lock file has already been removed, and celery-once tries to remove it again.
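That symptom is easy to reproduce in isolation: deleting a file that is already gone raises `FileNotFoundError` ("[Errno 2] No such file or directory"), which is exactly the shape of the traceback here. A minimal sketch (the file name is made up for illustration):

```python
import os
import tempfile

# Create a lock-style file, remove it once, then try to remove it again.
path = os.path.join(tempfile.mkdtemp(), "example.lock")
open(path, "w").close()

os.remove(path)              # first removal succeeds
try:
    os.remove(path)          # second removal: the file is already gone
    error = None
except FileNotFoundError as exc:
    error = exc              # [Errno 2] No such file or directory
```

So any code path that removes the lock file twice, or removes it after something else has cleared it, will blow up this way.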
@Cally99 Your /tmp/ will be cleared each time you start/stop the container. Hence, the locks will not be ensured, as the directory they are stored in does not persist between worker restarts.
I'm not sure whether FileNotFoundError is the best error to raise here, but there should be some sort of warning (or exception) to indicate to the user that the lock has failed.
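Since /tmp/ is ephemeral inside a container, one workaround is to point the file backend at a directory that is mounted as a Docker volume, so the locks survive restarts. A hedged sketch of the settings change (the `/data/celery_once/` path is an assumption for illustration; it presumes `/data` is volume-mounted, which is not part of the original report):

```python
# settings.py -- point celery-once at a volume-backed directory
# instead of the ephemeral /tmp/. The /data/celery_once/ path is an
# example and assumes /data is mounted as a persistent Docker volume.
CELERY_ONCE = {
    'backend': 'celery_once.backends.File',
    'settings': {
        'location': '/data/celery_once/',   # assumed persistent mount
        'default_timeout': 60 * 60,
    },
}
```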
@cameronmaske But according to the provided traceback, the FileNotFoundError was thrown on lock removal. Do you have any idea why that might happen? I'm seeing this in my setup too, albeit very rarely.
My theory is that the task is still in the broker (i.e. unprocessed), but the lock file is removed before the task is run. The lock is created before the task is sent into the broker.
If we break down the timeline, the following happens in order:

1. The lock file is created.
2. The task is sent into the broker.
3. /tmp/ is cleared some other way.
4. The task runs and, on completion, tries to remove a lock file that no longer exists.

Yeah, that makes sense. Redis ignores missing keys when the DEL command is executed, so I think we should do the same with the file backend.
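A release that mirrors Redis's DEL semantics would simply tolerate the missing-file case. A minimal sketch of the idea (`remove_lock` is a hypothetical helper for illustration, not celery-once's actual API):

```python
import os
import tempfile

def remove_lock(path):
    """Remove a lock file, ignoring the case where it is already
    gone -- the way Redis's DEL ignores missing keys."""
    try:
        os.remove(path)
    except FileNotFoundError:
        pass  # already released or cleared externally; nothing to do

# Demo: removing the same lock twice must not raise.
lock = os.path.join(tempfile.mkdtemp(), "example.lock")
open(lock, "w").close()
remove_lock(lock)
remove_lock(lock)  # second call is a no-op instead of crashing
```

On Python 3.8+, `pathlib.Path(path).unlink(missing_ok=True)` expresses the same thing in one call.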
I have a nearly identical setup and am also getting the same traceback. I think @cameronmaske's diagnosis is right on.
My tasks crash and no longer run due to a "no such file" error. I'm using the file-based backend, and this is my setup.
settings.py

```python
CELERY_ONCE = {
    'backend': 'celery_once.backends.File',
    'settings': {
        'location': '/tmp/celery_once/',
        'default_timeout': 60 * 60,
    }
}

CELERY_BROKER_URL = 'pyamqp://rabbitmq:5672'
CELERY_RESULT_BACKEND = 'django-db'
CELERYD_HIJACK_ROOT_LOGGER = False

# use json format for everything
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_TIMEZONE = 'UTC'
CELERYBEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
```
celery.py

```python
from __future__ import absolute_import
import os

from celery import Celery
from django.conf import settings

__all__ = ['celery', 'QueueOnce']

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'autobets.settings')
os.environ.setdefault('DJANGO_CONFIGURATION', 'Development')

import configurations
configurations.setup()

app = Celery('autobets')

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.conf.ONCE = settings.CELERY_ONCE
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```
tasks.py

```python
from celery import shared_task
from celery_once import QueueOnce

@shared_task(bind=True, base=QueueOnce, once={'graceful': True})
def get_events(self):
    # do stuff bla bla bla
    ...
```
Before using celery-once, tasks would run as normal.
Do I need to create the tmp backend file first, or does celery-once create it?
stack trace.