nickelser / activejob-traffic_control

Rate limiting/job enabling for ActiveJob using distributed locks in Redis or Memcached.

[Question] Stacking throttle? #4

Open psteininger opened 7 years ago

psteininger commented 7 years ago

I haven't tried it myself, but I'm wondering if there is a way to stack throttle definitions. Here is the use case: part of our system receives callbacks about incoming calls from provider A, hangs up the call, and then schedules a job to start a call from provider B. Provider B has a rate limit of 2 requests per second. However, provider A quite often hits us twice with a callback for the same call, 3-15 seconds apart. So I would like two throttle guards:

  1. A standard 2-per-second limit
  2. A parameter-based throttle (i.e. don't call the same number twice in a 20-second window)

How could I accomplish this?

nickelser commented 7 years ago

It isn't documented (the other open issue addresses this 😄 https://github.com/nickelser/activejob-traffic_control/issues/3), but you can base the throttle key on a lambda, so you could programmatically generate a key from the phone number.

So I would do something like this (untested, but it should work):

class NumberCallbackJob < ApplicationJob
  throttle threshold: 2, period: 1.second
  # key the concurrency lock by the phone number, assuming it is the first job argument
  concurrency 1, drop: true, key: ->(job) { job.arguments.first }

  def perform(phone_number)
    # do the stuff
  end
end
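
(A hypothetical usage note, not from the thread: the sketch above assumes the phone number is passed as the first job argument, so the key lambda can read it from job.arguments.first. Enqueuing would then look like this:)

# Hypothetical enqueue call; the phone number becomes job.arguments.first,
# which the concurrency key lambda above uses to scope the lock.
NumberCallbackJob.perform_later("+15551234567")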
nickelser commented 7 years ago

Oh my goodness, I didn't realize you opened both issues! Sorry if that is a repeat, then; let me know if there's an extension that would help with this (the feature request that might make sense would be multi-key handling).