python-arq / arq

Fast job queuing and RPC in python with asyncio and redis.
https://arq-docs.helpmanual.io/
MIT License

How can I prevent cron from starting a new task if the previous one is still running? #430

Closed Soures888 closed 7 months ago

Soures888 commented 7 months ago
```python
import asyncio

import loguru
from arq import cron


async def download_content(ctx):
    loguru.logger.debug('Job started, download content')
    await asyncio.sleep(12)  # simulate a slow download


def run_by_seconds_interval(seconds: int):
    # e.g. 3 -> {0, 3, 6, ..., 57}: fire on every matching second of the minute
    return {i for i in range(0, 60, seconds)}


class WorkerSettings:
    cron_jobs = [
        cron(download_content, minute=None, second=run_by_seconds_interval(3), run_at_startup=False),
    ]
    functions = [download_content]
```

How can I prevent starting a new task if the latest one hasn't finished? For example, in my production environment, I have a function that needs to run every 15 seconds. In some cases, one of these runs can take longer than 30 seconds, and I don't want this function to double-execute.

JonasKs commented 7 months ago

Hi,

It's documented here: https://arq-docs.helpmanual.io/#job-uniqueness

Soures888 commented 7 months ago

Hi @JonasKs, thank you for your answer. I have tested this, and it does not work for me. The issue with giving cron jobs a unique ID is that once a run has finished, arq still holds the previous job (or its result) under that ID, so the next scheduled run is treated as a duplicate and is never started.
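When runs may land on different workers, a common workaround (not arq-specific, and not endorsed by the arq docs) is a Redis lock taken inside the job itself: acquire a key with `SET NX EX` before doing the work, skip the run if the key is already held, and rely on the TTL to recover from a crashed worker. The helper below is a minimal sketch; `run_unless_running`, the lock key, and the TTL are all illustrative, and `redis` can be any redis-py-style client such as the one arq provides in `ctx['redis']`:

```python
import asyncio

LOCK_KEY = 'lock:download_content'  # illustrative key name


async def run_unless_running(redis, job, lock_key=LOCK_KEY, ttl=60):
    # SET NX EX acquires the lock only if nobody holds it; the TTL prevents
    # a crashed worker from leaving the lock behind forever.
    acquired = await redis.set(lock_key, '1', nx=True, ex=ttl)
    if not acquired:
        return None  # previous run still in progress: skip this tick
    try:
        return await job()
    finally:
        # Release eagerly so the next tick is not blocked until the TTL expires.
        await redis.delete(lock_key)
```

Inside the cron function this becomes `await run_unless_running(ctx['redis'], do_download)`. Note the eager `delete` can release a lock that expired mid-run and was re-acquired by another worker; a token check (compare-and-delete) closes that gap if it matters for your workload.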