valeeum opened 3 weeks ago
For anyone who stumbled onto this issue: while waiting for #2617 to merge, I have a simple hack that uses the existing Redis connection to store the repeatJobKey:
const job = await queue.add('name', data, {
  repeat: {
    // repeat opts
  },
})
await conn.set('repeatable-job-id', job.repeatJobKey)
And when you want to remove the job, query the key and call removeRepeatableByKey:
const key = await conn.get('repeatable-job-id')
if (key) await queue.removeRepeatableByKey(key)
await conn.del('repeatable-job-id')
I don't know if this hack is truly safe, but I'm using it right now and haven't found a problem yet.
@doaortu
This is not safe because, depending on the repeat options, you can have multiple jobs for the same job id.
Is your feature request related to a problem? Please describe.
I'm in the process of creating middleware for a microservices framework that allows services to run on a given schedule via BullMQ. The repeat options are provided in the service metadata as JSON configuration, and the repeatable job is created when the service broker is initialized via the middleware. The issue I'm facing is that if the repeat job options are changed and the service is restarted, two separate repeatable jobs are created.
Describe the solution you'd like
1) Create a removeRepeatableByJobId method that removes all instances of repeatable jobs by jobId (which I understand is not unique for repeatable jobs).
2) Add an optional upsert-like capability when adding repeatable jobs, so that if a job is already defined (by jobId), it gets overwritten with the new repeat options.
Describe alternatives you've considered
I understand that I can create a database table, store the repeatJobKey returned after calling the queue's add() method, and then use that value to call removeRepeatableByKey, but that seems like overkill for this purpose.