lykmapipo / kue-scheduler

A job scheduler utility for kue, backed by redis and built for node.js

Allow unique jobs to not reuse the same job, and update the unique data at schedule #75

Open · gmcnaught opened this issue 7 years ago

gmcnaught commented 7 years ago

Currently, unique scheduled jobs reuse the same job id to enforce a single running job.

I am suggesting that we use the point where the current redlock is taken when scheduling unique jobs as an opportunity to create a new job, provided the previous job is not running.

This way you keep logs, job history, and previous job data/results across the multiple job ids.

I believe it is also a requirement if you want to be able to update job data between runs of unique jobs, since the job data is currently stored on the single unique job.

The downside: you would have to rely on Kue's cleanup of processed jobs to keep from running out of space on your redis instance, which, while supported, is a change from kue-scheduler's current behavior.
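Roughly, this is what I have in mind at the point where the unique-job redlock is already held (a sketch only; the job-state check and helper shape are my assumptions, not existing kue-scheduler internals):

```javascript
// Sketch of the proposed behaviour once the unique-job redlock is held.
// The state check below is an assumption about kue's internal job state,
// not an existing kue-scheduler helper.
function isJobActive(job) {
  // kue keeps job state as one of: inactive, active, delayed, complete, failed.
  return job._state === 'inactive' || job._state === 'active' || job._state === 'delayed';
}

function scheduleUniqueRun(queue, previousJob, jobDefinition, done) {
  // Previous run still queued or running: keep it, so only one instance exists.
  if (previousJob && isJobActive(previousJob)) {
    return done(null, previousJob);
  }

  // Previous run finished (or never existed): create a brand new kue job with
  // a new id and the latest data, leaving the old job behind as history.
  // Old jobs would then be pruned by Kue's normal cleanup of processed jobs.
  const job = queue
    .createJob(jobDefinition.type, jobDefinition.data)
    .save(function (error) {
      return done(error, job);
    });
}
```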

lykmapipo commented 7 years ago

Hello @gmcnaught, if I understand you correctly, we need to have or improve the history of every job's results.

The way it is implemented so far is like this:

  1. every unique job - Use this if you want to maintain a single instance of a job whose run history/result is well known, e.g. fetching the RSS feed of a particular channel.

  2. every non-unique job - Use this if you want to maintain multiple instances of a job whose run history/results are not well known and where each run is of importance, e.g. sending an SMS or email.

So there are scenarios where you may want to maintain the history of every run, and scenarios where it is of no importance. A short sketch of the two cases follows below.

One last thing: kue-scheduler is just a layer on top of kue, and all job management is passed to kue once a job instance is created.
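For illustration, roughly how the two cases look with the current API (based on my reading of the kue-scheduler README; the job names and intervals are only examples):

```javascript
// Rough sketch of the two cases, using illustrative job names and intervals.
const kue = require('kue-scheduler');
const queue = kue.createQueue();

// 1. unique every job: a single job instance is kept and its id is reused on
//    every run (the behaviour this issue wants to relax).
const rssJob = queue
  .createJob('fetch-rss', { channel: 'news' })
  .attempts(3)
  .priority('normal')
  .unique('fetch-rss-news');
queue.every('1 hour', rssJob);

// 2. non-unique every job: each run produces its own job instance, so each
//    run keeps its own data, logs and result.
const emailJob = queue.createJob('send-email', { to: 'user@example.com' });
queue.every('5 minutes', emailJob);
```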

Hope we can start from here.

schmod commented 7 years ago

I use unique jobs to ensure that there is only a single instance of a particular job type.

However, I might still want more information about the runtime/logs of every instance of that job.

For instance, I would never want to schedule more than one "Generate Daily Report" job, but I might not necessarily want to reuse the same Job ID every day.

adampatarino commented 7 years ago

I could see this being useful if you want to run the job once outside of its normal schedule, but maintain the results in the log of the overall instance. #95
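For example, something like this (assuming queue.now as described in the kue-scheduler README; the job name is only illustrative) would trigger the extra run:

```javascript
// Illustrative only: trigger a one-off run of an already-scheduled unique job,
// assuming queue.now() as described in the kue-scheduler README.
const kue = require('kue-scheduler');
const queue = kue.createQueue();

const extraRun = queue
  .createJob('generate-report', { reason: 'manual re-run' })
  .unique('generate-report'); // same unique key as the scheduled job

queue.now(extraRun);
// Because the unique key (and thus the job id) is reused, the result of this
// manual run overwrites the previous one instead of being kept as history,
// which is exactly the limitation discussed in this issue.
```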

lykmapipo commented 7 years ago

@adampatarino, @schmod & @gmcnaught

So the feature required is:

The ability to keep a history of job runs, especially for a job scheduled to run at a specific time interval. If so, this can be implemented; if kue can not store it, we can see if we can store it on redis ourselves, for example along the lines of the sketch below.

I have come across the same scenario, and based on the project I am working on, hopefully I will refactor the history part and move it to kue-scheduler.
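A minimal sketch of storing run history on redis ourselves (a workaround, not a kue-scheduler feature; the key names and payload shape are illustrative):

```javascript
// Minimal sketch: record each completed run in a redis list keyed by job type,
// using kue's queue-level 'job complete' event. Key names and the payload
// shape are illustrative, not part of kue-scheduler.
const kue = require('kue-scheduler');
const redis = require('redis');

const queue = kue.createQueue();
const client = redis.createClient();

queue.on('job complete', function (id, result) {
  kue.Job.get(id, function (error, job) {
    if (error) { return; }
    const entry = JSON.stringify({
      id: id,
      type: job.type,
      data: job.data,
      result: result,
      completedAt: Date.now()
    });
    // Append this run to the history list for its job type.
    client.lpush('q:history:' + job.type, entry);
  });
});
```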

Hope it will help.

bilby91 commented 6 years ago

Hello! First of all thanks for building this project!

We are starting to use the project in a new application, and we are wondering: what is the correct way to update the cron information of a unique job? At the moment, if we reschedule a job with the same id but change the cron data, this doesn't seem to take effect. Is there any way to do that? If not, what changes need to be made? I can submit a PR.
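For example, is a flow like this the intended approach (I am assuming here that queue.remove accepts unique-name criteria; please correct me, and the job names are only illustrative)?

```javascript
// Hypothetical workaround: un-schedule the existing unique job, then schedule
// it again with the new cron expression. The criteria shape passed to
// queue.remove() is an assumption; check the README for the exact API.
const kue = require('kue-scheduler');
const queue = kue.createQueue();

queue.remove({ unique: 'daily-report' }, function (error) {
  if (error) { return console.error(error); }

  const job = queue
    .createJob('daily-report', { recipients: ['ops@example.com'] })
    .unique('daily-report');

  // Re-schedule with the updated cron expression.
  queue.every('0 6 * * *', job);
});
```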

vjustov commented 6 years ago

I just ran into this issue as well, having the exact same scenario as @schmod. Is there any progress on this?

Also, I don't know if it is related, but when hooking a complete event to the 'unique every' job, the event is only fired the first time.