Closed shawnzhu closed 6 years ago
@shawnzhu This is easy to implement, as I've done something similar in the official opm package server software:
https://github.com/openresty/opm
Well, we could add a dedicated Lua module to this distribution. Could you propose an API here first, to avoid too much back-and-forth later on?
Example configuration to limit 5000 requests per hour:

```nginx
http {
    lua_shared_dict my_limit_req_store 100m;

    server {
        location / {
            access_by_lua_block {
                local limit_req = require "resty.limit.ghe_req"

                -- rate: 5000 requests per 3600s
                local lim, err = limit_req.new("my_limit_req_store", 5000, 3600)
                if not lim then
                    ngx.log(ngx.ERR, "failed to instantiate a resty.limit.ghe_req object: ", err)
                    return ngx.exit(500)
                end

                -- use the Authorization header as the limiting key
                local key = ngx.req.get_headers()["Authorization"] or "public"
                local remaining, reset = lim:incoming(key, true)

                ngx.header["X-RateLimit-Limit"] = "5000"
                ngx.header["X-RateLimit-Reset"] = reset

                if remaining < 0 then
                    ngx.header["X-RateLimit-Remaining"] = 0
                    ngx.log(ngx.WARN, "rate limit exceeded")
                    return ngx.exit(403)
                else
                    ngx.header["X-RateLimit-Remaining"] = remaining
                end
            }
        }
    }
}
```
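For reference, here is a minimal sketch of how such a module's `new()`/`incoming()` could be implemented as a fixed-window counter on a shared dict. This is an assumption about the internals, not the actual implementation; the method names and return values follow the example above:

```lua
-- hypothetical fixed-window counter sketch, not the actual module
local _M = {}
local mt = { __index = _M }

function _M.new(dict_name, limit, period)
    local dict = ngx.shared[dict_name]
    if not dict then
        return nil, "shared dict '" .. dict_name .. "' not found"
    end
    return setmetatable({ dict = dict, limit = limit, period = period }, mt)
end

-- returns the remaining quota (negative when over the limit) and the
-- epoch time at which the current window resets
function _M.incoming(self, key, commit)
    local window = math.floor(ngx.time() / self.period)
    local dict_key = key .. ":" .. window
    local reset = (window + 1) * self.period

    local count
    if commit then
        -- incr with an init value is atomic across nginx workers; the
        -- init_ttl makes stale window counters expire on their own
        local err
        count, err = self.dict:incr(dict_key, 1, 0, self.period)
        if not count then
            return nil, err
        end
    else
        count = (self.dict:get(dict_key) or 0) + 1
    end

    return self.limit - count, reset
end

return _M
```

A fixed window keeps the shared-dict state to one counter per key, at the cost of allowing short bursts around window boundaries.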
Maybe this feature could be implemented in the existing resty.limit.req module, if it is updated to receive an opts table instead of the current arguments? Something like:

```lua
local lim, err = limit_req.new(shm, limit, opts?)

-- shm: string
-- limit: number
-- opts: table
-- {
--     period = 1, -- the period in seconds, defaults to 1 for limit/s
--     burst = 0,  -- the allowed burst for that period
-- }

-- Another acceptable form could probably be:
local lim, err = limit_req.new(shm, limit, burst, period)
```
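One way to keep existing callers working would be to have `new()` accept either the current positional arguments or the proposed opts table. A hypothetical sketch (the field names and defaults are assumptions from the proposal above):

```lua
local _M = {}
local mt = { __index = _M }

-- hypothetical backward-compatible constructor: the third argument may
-- be either the current positional 'burst' number or an opts table
function _M.new(dict_name, rate, opts)
    local burst, period

    if type(opts) == "table" then
        period = opts.period or 1
        burst = opts.burst or 0
    else
        -- current form: new(dict_name, rate, burst)
        burst = opts or 0
        period = 1
    end

    local dict = ngx.shared[dict_name]
    if not dict then
        return nil, "shared dict '" .. dict_name .. "' not found"
    end

    return setmetatable({
        dict = dict,
        rate = rate,
        burst = burst,
        period = period,
    }, mt)
end

return _M
```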
I believe it should be possible to easily update the limit_req module to use a configurable period instead of hardcoded seconds. The point is that once it is implemented this way, setting different rates for hours vs. minutes or seconds becomes a matter of creating several such instances and combining them with the existing resty.limit.traffic module?

Following this assumption, the incoming() method should probably return the reset value highlighted in your API proposal @shawnzhu, so that the resty.limit.traffic combine method can return it as well, probably as a third return value to maintain backwards compatibility:

```lua
delay, err, reset = class.combine(limiters, keys, states?)
-- 'reset' being the time at which all the currently configured rates
-- would be reset (i.e. the longest reset value of all the instances)
```
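A simplified sketch of how such a combine() could aggregate the longest reset value (this is hypothetical: it assumes incoming() gains a third return value, and it omits the rollback of committed state on rejection that the real resty.limit.traffic.combine performs):

```lua
-- hypothetical sketch of an extended combine() returning 'reset'
local function combine(limiters, keys, states)
    local max_delay = 0
    local max_reset = 0

    for i = 1, #limiters do
        local delay, err, reset = limiters[i]:incoming(keys[i], true)
        if not delay then
            return nil, err
        end

        if states then
            states[i] = err -- per-limiter state, as in the existing API
        end

        if delay > max_delay then
            max_delay = delay
        end
        if reset and reset > max_reset then
            max_reset = reset
        end
    end

    return max_delay, nil, max_reset
end
```

Returning the maximum across instances matches the comment above: a client is only fully "reset" once the slowest window has rolled over.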
@thibaultcha Thanks for the feedback! It's doable by extending the existing resty.limit.req module, but I got the impression that its design goal is just to be a counterpart of the Nginx module ngx_http_limit_req_module. It would be an API design change for resty.limit.req to add options like period plus a new return value of incoming() like reset. I will 👍🏼 if such an API design change plus backward compatibility is acceptable.
I think it's better to be in a separate Lua module since it's quite different from the existing resty.limit.req module. But ghe_req looks confusing. What does it mean? GitHub something? It deserves a better name :)

Yeah, it should be usable by the combine method of resty.limit.traffic.
@agentzh I've created https://github.com/openresty/lua-resty-limit-traffic/issues/17 to use a better name, openresty.limit.count. I will add a new commit to make it usable by resty.limit.traffic.
is this feature live?
Sorry, been busy and haven't got the chance to look at this again.
I believe it has been released as part of OpenResty 1.13.6.1?
https://openresty.org/en/changelog-1013006.html
> upgraded lua-resty-limit-traffic to 0.05. feature: added new module resty.limit.count for GitHub API style request count limiting. thanks Ke Zhu for the original patch and Ming Wen for the followup tweaks.
Clarified in #29.
Oh, right. This is already merged and released.
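For completeness, using the released resty.limit.count module looks roughly like this. The return-value semantics (second return value holding the remaining quota on success, and `"rejected"` on failure) are from my reading of the module's documentation and may differ across releases, so treat this as a sketch:

```lua
-- inside an access_by_lua_block, with
-- `lua_shared_dict my_limit_count_store 100m;` declared in the http block
local limit_count = require "resty.limit.count"

-- 5000 requests per 3600s window
local lim, err = limit_count.new("my_limit_count_store", 5000, 3600)
if not lim then
    ngx.log(ngx.ERR, "failed to instantiate resty.limit.count: ", err)
    return ngx.exit(500)
end

local key = ngx.req.get_headers()["Authorization"] or "public"
local delay, err = lim:incoming(key, true)

ngx.header["X-RateLimit-Limit"] = "5000"

if not delay then
    if err == "rejected" then
        ngx.header["X-RateLimit-Remaining"] = 0
        return ngx.exit(403)
    end
    ngx.log(ngx.ERR, "failed to limit count: ", err)
    return ngx.exit(500)
end

-- on success the second return value is the remaining count
ngx.header["X-RateLimit-Remaining"] = err
```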
I've started some experiments with ngx_http_limit_req_module-style request rate limiting using resty.limit.req, and here is the outcome I've collected: the resty.limit.req module serves status code 429 when a request is rejected. I would propose a new module which implements GitHub API style rate limiting; my experiments show pretty good outcomes.

@agentzh are you interested in this new feature in this repo? I'm glad to create a PR if this is a potential improvement.