Closed 10 years ago
For per-IP rate limiting you need to keep track of a token bucket for every IP address. Fortunately the token bucket object takes up very little memory, but to keep things from getting out of hand you will still likely need to use some form of cache for the rate limiters. I recommend a cache that expires objects which have not been accessed in some period of time (see https://github.com/ptarjan/node-cache as an example).
A full implementation of what you're asking for is out of scope of this project, sorry about that. node-rate-limiter just provides the lower level rate limiting mechanism.
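For anyone who wants the shape of that recommendation without pulling in a cache library, here is a minimal self-contained sketch of per-IP token buckets with idle expiry. The names (`allow`, `RATE`, `WINDOW_MS`, `IDLE_MS`) are illustrative only and are not part of node-rate-limiter's API:

```javascript
// Per-IP token buckets kept in a Map, with idle eviction folded into lookup.
// Simplified stand-in for "token bucket per IP + expiring cache".
const buckets = new Map(); // ip -> { tokens, last }

const RATE = 500;          // tokens per window (500 requests/hour, as below)
const WINDOW_MS = 3600000; // one hour
const IDLE_MS = 10000;     // treat buckets idle longer than this as expired

function allow(ip, now = Date.now()) {
  let b = buckets.get(ip);
  if (!b || now - b.last > IDLE_MS) {
    b = { tokens: RATE, last: now }; // new or expired bucket starts full
  }
  // refill continuously, proportional to elapsed time, capped at RATE
  b.tokens = Math.min(RATE, b.tokens + (now - b.last) * (RATE / WINDOW_MS));
  b.last = now;
  if (b.tokens >= 1) {
    b.tokens -= 1;
    buckets.set(ip, b);
    return true;
  }
  buckets.set(ip, b);
  return false;
}
```

Each bucket is a couple of numbers, so memory stays small, and stale entries are replaced in place rather than scanned for.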
I followed your advice and combined your limiter with the node-cache:
var RateLimiter = require('limiter').RateLimiter;
var cache = require('memory-cache');

exports.isLimited = function(req, res, next) {
    var cachedLimiter = cache.get(req.ip);
    if (cachedLimiter) {
        if (cachedLimiter.getTokensRemaining() > 1) {
            cachedLimiter.removeTokens(1, function() {});
            cache.put(req.ip, cachedLimiter, 10000); // refresh the 10 s idle expiry
            return next();
        }
        res.send("too many requests from your ip!");
    } else {
        cachedLimiter = new RateLimiter(500, 'hour'); // 500 requests per hour per IP
        cache.put(req.ip, cachedLimiter, 10000);
        return next();
    }
};
Now specific requests can be limited on a per-IP basis from the express-app like this:
var limiter = require('./controllers/limiter');
app.get('/isEmailAvailable/:email', limiter.isLimited, userController.isEmailAvaliable);
Just in case anyone comes across this issue in the future! :)
@eirikfro looks great, thanks for the example code! You should be able to change the remaining-tokens check to if (cachedLimiter.getTokensRemaining() > 0) so users can go all the way down to zero available requests, but otherwise that seems solid.
I'm going to close this issue out now, feel free to reopen if needed.
This is the answer I was looking for. I would never have found it had I not checked the closed issues; I was about to request this as a feature myself. To the author of this repo: please consider adding the above snippet to the README for others looking for this answer. I was actually about to turn away from this project until I saw this issue. Thanks to @eirikfro for sharing the snippet!
Hi @eirikfro,
I tried using your code
checkBy = req.headers.app_id;
rate = 416;
period = 'hour';
waitFor = 10000;

checkRateLimit(checkBy, rate, period);

checkRateLimit = function(checkBy, rate, period) {
    var RateLimiter = Npm.require('limiter').RateLimiter;
    var cache = Npm.require('memory-cache');
    if (cache.get(checkBy)) {
        var cachedLimiter = cache.get(checkBy);
        if (cachedLimiter.getTokensRemaining() > 1) {
            cachedLimiter.removeTokens(1, function() {});
            cache.put(checkBy, cachedLimiter, waitFor);
            return;
        }
        console.log("error");
        throw new Error('Too many requests for the app_id: ' + checkBy + '!');
    } else {
        var cachedLimiter = new RateLimiter(rate, period);
        cache.put(checkBy, cachedLimiter, waitFor);
        return;
    }
}
but when I run a while loop of 500 requests, they all get through. Could you tell me what I'm doing wrong? Also, how should I use the waitFor parameter in the cache.put function? I want to lock out a user's requests for an hour once they exceed the limit, but even with waitFor = 360000 it still lets all requests through.
Thanks in advance.
@boboci9 It's kind of hard to diagnose your problem without seeing how you're calling checkRateLimit.
From what you've explained, it sounds like you're running into the typical async problem. If you're in a while loop, the limiter never has a chance to reduce itself before the next iteration runs. You have two choices to fix your code: wait for this method to return before continuing to the next loop iteration, or use the synchronous method provided.
The issue:
for (...) {
    checkRateLimit(...)
}
That won't work the way you think it does because, as previously stated, the loop moves on before the async function has had time to complete. You need to set up some sort of flow control.
This is the offending line, in case you're wondering:
cachedLimiter.removeTokens(1, function(){});
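If you want to keep the asynchronous removeTokens, one way to add flow control is to start each iteration only from inside the previous call's callback. A sketch of that pattern; makeFakeLimiter is just a stand-in with the same callback shape so the example runs on its own, and in real code you would pass a node-rate-limiter instance instead:

```javascript
// Stand-in exposing removeTokens(count, cb) with node-rate-limiter's
// callback shape; swap in a real RateLimiter instance in actual code.
function makeFakeLimiter(tokens) {
  return {
    removeTokens: function (n, cb) {
      setImmediate(function () {
        tokens -= n;
        cb(null, tokens);
      });
    }
  };
}

// Callback-based flow control: the next iteration starts only after the
// previous removeTokens has completed, so the limiter stays accurate.
function runSequentially(limiter, count, done) {
  var i = 0;
  (function next() {
    if (i >= count) return done();
    i++;
    limiter.removeTokens(1, function (err, remaining) {
      next(); // proceed only once this token has been consumed
    });
  })();
}

runSequentially(makeFakeLimiter(500), 5, function () {
  // all 5 removals have finished, one at a time
});
```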
The synchronous way to implement this method would be to use tryRemoveTokens.
For example:
if (cachedLimiter.getTokensRemaining() > 1) {
    cachedLimiter.tryRemoveTokens(1);
    cache.put(checkBy, cachedLimiter, waitFor);
    return;
}
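To see why the synchronous check behaves correctly even in a tight loop, here is a self-contained stand-in. FixedBucket is illustrative only, but it mirrors tryRemoveTokens' return-a-boolean shape:

```javascript
// Stand-in mirroring tryRemoveTokens: consumes a token and returns true
// when one is available, returns false otherwise, with no callback.
class FixedBucket {
  constructor(tokens) { this.tokens = tokens; }
  getTokensRemaining() { return this.tokens; }
  tryRemoveTokens(n) {
    if (this.tokens < n) return false;
    this.tokens -= n;
    return true;
  }
}

// A tight loop now limits correctly, because each removal takes effect
// before the next iteration's check runs.
const bucket = new FixedBucket(10);
let allowed = 0, rejected = 0;
for (let i = 0; i < 30; i++) {
  if (bucket.tryRemoveTokens(1)) allowed++;
  else rejected++;
}
// allowed === 10, rejected === 20
```

With the async removeTokens the same loop would let all 30 through, since every check would run before any removal completed.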
As @bugs181 mentioned, this looks like a JavaScript async problem. JavaScript is great, but it can seem very strange if you're used to languages where code runs sequentially. :) I think @bugs181's solution will help you.
Hi guys, thanks for the answers. My while loop is actually making API calls, so I thought that should work. I will try it with tryRemoveTokens, but my greatest issue was this: if I set a rate of 10 calls per hour, everything worked and requests stopped after the 10th; but if I set 10 calls per second, the problem appears and it lets through all 30 API calls that I have in the while loop.
Limits = 40/day. Once 40 requests are served, irrespective of IP address, it redirects to the 429 view. So even a visitor hitting the site for the first time is redirected to 429. Can we have rate limiting for every IP address?