fastify / fastify-rate-limit

A low overhead rate limiter for your routes

use toad-cache, fix bugs, simplify the code #320

Closed · gurgunday closed this 1 year ago

gurgunday commented 1 year ago

Rate limiting is a crucial part of a web application, so we must constantly strive to improve fastify-rate-limit's performance and architecture

Before doing some more significant improvements, I wanted to simplify the codebase so that future changes don't get mixed up with these

This PR results in some performance gain thanks to the following changes:

use toad-cache

toad-cache is faster than tiny-lru in the benchmarks that matter to us, and I've shared my interest with kibertoad in centralizing all of Fastify's LRU and FIFO caching on his package
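
To illustrate the swap (a minimal sketch, not the plugin's actual store code, and assuming toad-cache's LruMap exposes the same get/set surface as tiny-lru):

```js
// Illustrative only — assumes toad-cache exports an LruMap with a tiny-lru-like API.
// Before: const { lru } = require('tiny-lru'); const cache = lru(5000)
const { LruMap } = require('toad-cache')

const cache = new LruMap(5000) // max entries; no TTL here, expiry is handled lazily (see below)
cache.set('10.0.0.1', { current: 1 })
console.log(cache.get('10.0.0.1')) // -> { current: 1 }
```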

use our own TTL

While looking at the source code, I realized that we already store an iterationStartTime obtained via Date.now(); however, we aren't doing enough with that information and just return it to the user. We can increase performance by disabling the TTL in the LRU (which would otherwise make additional Date.now() calls) and resetting a request's window lazily ourselves
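
A minimal sketch of the lazy-reset idea, using a hypothetical hit() helper rather than the plugin's real store code: the iteration start time lives next to the counter, and the first request after the window has elapsed starts a fresh iteration, so the cache itself never needs to call Date.now():

```js
// Hypothetical helper — illustrates lazy window resets on top of any get/set cache.
function hit (cache, key, timeWindowMs) {
  const now = Date.now()
  let entry = cache.get(key)

  // Reset lazily: if the stored iteration is older than the window,
  // start a new one instead of relying on a cache-level TTL.
  if (entry === undefined || now - entry.iterationStartMs >= timeWindowMs) {
    entry = { current: 0, iterationStartMs: now }
  }

  entry.current++
  cache.set(key, entry)

  // ttl tells the caller how long until the window resets (useful for headers)
  return { current: entry.current, ttl: timeWindowMs - (now - entry.iterationStartMs) }
}
```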

move objects and functions out of the main function

Since fastify-rate-limit supports being registered multiple times, every registration risks recreating the objects and functions defined inside the main plugin function, which can lead to increased memory usage
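
Sketched with hypothetical names, the idea is simply that anything independent of per-registration options moves to module scope, so repeated register() calls reuse it:

```js
// Hypothetical sketch — not the plugin's actual layout.
// Created once per process, shared by every registration:
const defaultOptions = { max: 1000, timeWindow: 60000 }

function defaultKeyGenerator (req) {
  return req.ip
}

async function rateLimitPlugin (fastify, opts) {
  // Only per-registration state is allocated here; registering the plugin
  // several times no longer recreates the shared defaults and helpers above.
  const settings = { ...defaultOptions, ...opts }
  settings.keyGenerator = settings.keyGenerator || defaultKeyGenerator
  // ...hooks would be added here using `settings`
}

module.exports = rateLimitPlugin
```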

change logic order

In places like the onRequest handler, we were doing calculations before checking the conditions that would let us exit the logic flow early; for instance, we parsed the timeWindow string with ms before checking whether the hook had already run
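
A hypothetical hook body showing the reordering (the field names are illustrative, not the plugin's real options): the cheap guard checks come first, so only requests that will actually be rate limited pay for the ms parse:

```js
const ms = require('ms')

// Hypothetical handler body — illustrative names, not the plugin's internals.
async function onRequestRateLimit (req, routeConfig) {
  if (routeConfig.rateLimit === false) return       // route opted out: exit early
  if (req.rateLimitAlreadyApplied === true) return  // hook already ran: exit early

  // Only now do the expensive work, e.g. parsing a "1 minute"-style timeWindow.
  const timeWindowMs = typeof routeConfig.timeWindow === 'string'
    ? ms(routeConfig.timeWindow)
    : routeConfig.timeWindow

  // ...counter increment and rate-limit headers would follow, using timeWindowMs
}
```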

bug fixes

I've added tests for the bug fixes

Benchmark (new vs. old, highest of 5 runs):

New (10 connections)

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 0 ms │ 0.01 ms │ 0.04 ms │ 10 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 48991   │ 48991   │ 50591   │ 66495   │ 54250.19 │ 6412.41 │ 48982   │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 17.9 MB │ 17.9 MB │ 18.9 MB │ 19.9 MB │ 18.9 MB  │ 517 kB  │ 17.8 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

150000 2xx responses, 446750 non 2xx responses
597k requests in 11.01s, 208 MB read

Old (10 connections)

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬──────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max  │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼──────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 0 ms │ 0.01 ms │ 0.04 ms │ 7 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴──────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 47871   │ 47871   │ 49983   │ 66111   │ 53460.37 │ 6605.44 │ 47843   │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 17.7 MB │ 17.7 MB │ 18.7 MB │ 19.7 MB │ 18.6 MB  │ 542 kB  │ 17.7 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

150000 2xx responses, 438058 non 2xx responses
588k requests in 11.01s, 205 MB read

gurgunday commented 1 year ago

Yeah, should've tested Redis as well, will take a second

Uzlopak commented 1 year ago

@kibertoad Do you agree with using toad-cache? I know the maintainer of toad-cache, and this guy seems shady.

kibertoad commented 1 year ago

@Uzlopak I only trust him as far as I can throw him, so no. We need more respectable members of the community in that team for increased confidence, so hopefully we can get some.

(sent invites to you and @gurgunday just in case you are open to it, feel free to decline if not)

Uzlopak commented 1 year ago

:D

Thank you

gurgunday commented 1 year ago

Thanks @kibertoad! 🙏🚀

gurgunday commented 1 year ago

Alright, you’re right, I’ll break them up 🙏

But I think the refactor is important and should land in one way or another