OptimalBits / bull

Premium Queue package for handling distributed jobs and messages in NodeJS.

Different job weights for the rate limit #1035

Open cesarvarela opened 6 years ago

cesarvarela commented 6 years ago

Hi, I need to limit calls to an API to a maximum of 250/second, and I configured the rate limiter accordingly, but some jobs make more than 1 call to this API, so I need some jobs to take more than 1 slot of the 250 available each second (I cannot split the calls into separate jobs because the calls must follow a specific order, and I'm processing these jobs with a concurrency of 250).

Is there an easy solution for this? The only option that comes to mind is to modify the Lua scripts so they increment the rate limiter key by an amount that depends on the job weight I mention above.
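For context, this is roughly how the queue is set up right now (the queue name and `performApiCall` are placeholders): every job counts as exactly 1 against the 250/second limit, no matter how many API calls it performs internally.

```js
const Queue = require('bull');

// Current setup: the limiter counts each job as exactly one "call",
// even when the job performs several API calls internally.
const apiQueue = new Queue('api-calls', {
  limiter: {
    max: 250,      // at most 250 jobs...
    duration: 1000 // ...started per second
  }
});

// Jobs are processed with a concurrency of 250.
apiQueue.process(250, async job => {
  // job.data.apicalls is an ordered list of calls that must run together,
  // so they cannot be split into separate jobs.
  for (const call of job.data.apicalls) {
    await performApiCall(call); // placeholder for the real API client
  }
});
```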

thanks.

manast commented 6 years ago

currently the rate limiter is global for all the jobs in the queue. What you are proposing is a kind of credit system where a job consumes credits and can get rate limited by how many it uses, but that is difficult to implement in practice without edge cases, especially when the number of "credits" a job needs is not known before the job starts.
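To illustrate the idea (this is only a sketch in plain JS, not Bull's actual Lua scripts; `weight` is a hypothetical option and `redis` is assumed to be an ioredis-compatible client), a credit-based limiter would have to reserve a job's weight before letting it start:

```js
// Hypothetical credit check. Returns true if the job may start now,
// false if it must wait for the next rate-limit window.
async function tryConsumeCredits(redis, key, weight, max, durationMs) {
  const used = await redis.incrby(key, weight); // reserve `weight` credits
  if (used === weight) {
    // first reservation in this window: start the window timer
    await redis.pexpire(key, durationMs);
  }
  if (used > max) {
    // over budget: give the credits back and rate limit this job
    await redis.decrby(key, weight);
    return false;
  }
  return true;
}
```

This only works if the weight is known before the job is moved to active, which is exactly the constraint above.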

cesarvarela commented 6 years ago

Hi, actually (I think) this is simpler than that, because we know the weight of the job before adding it to the queue: the job data contains exactly the N calls that need to be made, so when a new job is moved to the active list it should consume N credits instead of 1.

Adding it to the queue should look something like this:

await queue.add({ apicalls }, { weight: apicalls.length })
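On the dequeue side, the accounting would then change from a fixed increment of 1 to the job's declared weight, for example (continuing the hypothetical helper sketched earlier in the thread; `weight` on the job options and `limiterKey` are not existing Bull APIs):

```js
// Hypothetical: charge the limiter by the job's declared weight instead of 1.
const weight = (job.opts && job.opts.weight) || 1; // jobs without a weight cost 1 credit
const allowed = await tryConsumeCredits(redis, limiterKey, weight, 250, 1000);
if (!allowed) {
  // over the 250/second budget: the job waits for the next window
}
```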