clarkie / dynogels

DynamoDB data mapper for node.js. Originally forked from https://github.com/ryanfitz/vogels

Add rate limiting support for query and scan. #99

Closed themez closed 7 years ago

themez commented 7 years ago

Uses a token bucket to keep a query or scan from consuming too much provisioned throughput. Example:

const stream = BlogPost
  .query('werner@example.com')
  .loadAll()
  .consumeThroughput(10) // proposed: rate-limit this query's consumed capacity with a token bucket
  .exec();               // with no callback, exec() returns a readable stream

stream.on('data', data => {
  console.log('%j', data);
});
coveralls commented 7 years ago

Coverage Status

Coverage decreased (-0.9%) to 97.72% when pulling b9a5513f05436f4260eda3b00bcec8c52556b8d1 on ruguoapp:rate-limit into 6b3f75c123ec1924d70d0ab1178519012c98b079 on clarkie:master.

coveralls commented 7 years ago

Coverage Status

Coverage decreased (-0.6%) to 98.095% when pulling 8b7e43bf3de771d4f69b8340891f24ab9213df68 on ruguoapp:rate-limit into 6b3f75c123ec1924d70d0ab1178519012c98b079 on clarkie:master.

coveralls commented 7 years ago

Coverage Status

Coverage decreased (-0.6%) to 98.095% when pulling 30919b239d12ef24bbead944da7b1e7007baedbe on ruguoapp:rate-limit into 6b3f75c123ec1924d70d0ab1178519012c98b079 on clarkie:master.

clarkie commented 7 years ago

Hi @themez, thanks for your contribution. Could you describe a use case for this? How would you expect this to work in an application with a cluster of servers?

themez commented 7 years ago

Hi @clarkie, in my case I often need to query large datasets. Think of a feed pusher: several workers concurrently push feed items to channel subscribers, so they continually query a channel's subscribers, of which there may be a great many. I need to limit the workers' total throughput usage.

Normally a token bucket is used for rate limiting, but since the capacity consumed by a request isn't known until it finishes, it's hard to stop workers from starting a query when there aren't enough tokens. At first I just added the ability to limit a single load-all query's request rate, using a simple external rate limiter. You can check the commit history.
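
Roughly what I mean by an external rate limiter is the sketch below. The class and names are purely illustrative, not the code in this PR: tokens refill at a fixed rate, and a worker takes one token before sending each query page, which caps the request rate but not the actual consumed capacity.

// Illustrative token bucket for capping request rate (not the PR's actual code).
class TokenBucket {
  constructor(ratePerSec, capacity) {
    this.ratePerSec = ratePerSec;
    this.capacity = capacity;
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  _refill() {
    const now = Date.now();
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.ratePerSec);
    this.lastRefill = now;
  }

  // Resolves once `count` tokens are available, then removes them.
  async take(count) {
    for (;;) {
      this._refill();
      if (this.tokens >= count) {
        this.tokens -= count;
        return;
      }
      await new Promise(resolve => setTimeout(resolve, 50));
    }
  }
}

// A worker calls `await bucket.take(1)` before each query/scan page,
// so at most `ratePerSec` requests per second are issued overall.
const bucket = new TokenBucket(5, 5);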

That wasn't enough for my use case, so I wrote my own rate limiter and added a "pre-auth" mode: a query must "pre-auth" before sending a request and then settle the transaction with the amount actually consumed, much like a pre-authorization on a credit card.
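
To make the pre-auth flow concrete, here is a minimal sketch of the idea; it extends the illustrative bucket above and is not the exact implementation in this branch. `sendQueryPage` is a hypothetical helper standing in for one query/scan request made with ReturnConsumedCapacity enabled.

// Sketch of the "pre-auth" flow (illustrative, not the code in this branch).
class PreAuthBucket extends TokenBucket {
  // Hold an estimated number of capacity units before the request is sent.
  async reserve(estimatedUnits) {
    await this.take(estimatedUnits);
    return { held: estimatedUnits };
  }

  // Settle the hold against what the request actually consumed:
  // refund the difference, or charge the extra (tokens may briefly go negative).
  settle(hold, consumedUnits) {
    this.tokens += hold.held - consumedUnits;
  }
}

// Per-request flow, like a credit-card pre-authorization:
// const hold = await bucket.reserve(5);                       // hold an estimate of 5 units
// const page = await sendQueryPage();                         // hypothetical helper
// bucket.settle(hold, page.ConsumedCapacity.CapacityUnits);   // reconcile with actual usage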

I'm still working on it; any suggestions would be appreciated.

themez commented 7 years ago

On second thought, this feature may not be a good fit for this library; it could be handled at the application level instead.