fastify / fastify-rate-limit

A low overhead rate limiter for your routes

chore: remove defaultOnBanReach helper fn #313

Closed. gurgunday closed this 1 year ago

gurgunday commented 1 year ago

All custom function parameters, such as onExceeded and onExceeding, only get called if they're defined. However, onBan falls back to a default function that is just empty (old lines 281 and 268).

Under stress from an IP, that function can get called thousands of times, which probably still wouldn't cause a problem (🤣), but why run it in the first place?
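
Roughly, the two dispatch styles look like this (an illustrative sketch, not the actual fastify-rate-limit source; the handler arguments are an assumption):

// Illustrative sketch only, not the actual plugin code;
// the (req, key) arguments are assumed for the example.
const options = { onExceeding: undefined, onBan: undefined }
const req = { ip: '127.0.0.1' }
const key = req.ip

// onExceeding/onExceeded style: guarded by a typeof check, skipped when unset
if (typeof options.onExceeding === 'function') {
  options.onExceeding(req, key)
}

// onBan style before this PR: always resolves to a callable via an empty
// default, so a call happens even when no handler was registered
const defaultOnBan = () => {}
const onBan = options.onBan || defaultOnBan
onBan(req, key)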


Uzlopak commented 1 year ago

@mcollina

Are you sure? Is a noop not faster than optional chaining?

gurgunday commented 1 year ago

I actually forgot about this one, but the main point is that onExceeding and onExceeded both had typeof checks, while onBan called an empty default function, which you say is the better approach, and I don't object to that.

This PR is simply about consistency: either they should all have empty default functions or they should all be called conditionally, just as either all options should be type checked or none of them should.

PS: I think the difference is extremely small

gurgunday commented 1 year ago

@Uzlopak would you be more comfortable accepting this if I reverted the defaultOnBan change and instead created empty default onExceeded and onExceeding functions? They currently have typeof checks that get evaluated on every handler run.

We can agree that the string comparison is the least performant of the three.
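
As a rough way to compare the three styles (typeof check, noop default, optional call), something like the following micro-benchmark sketch could be used; it is illustrative only and very sensitive to V8 optimizations:

// Micro-benchmark sketch of the three dispatch styles; illustrative only,
// results are noisy and heavily affected by V8 optimizations.
const noop = () => {}
const maybeFn = undefined // simulates a handler the user did not pass
const N = 1e8

console.time('typeof check')
for (let i = 0; i < N; i++) {
  if (typeof maybeFn === 'function') maybeFn()
}
console.timeEnd('typeof check')

console.time('noop default')
const withDefault = maybeFn || noop
for (let i = 0; i < N; i++) {
  withDefault()
}
console.timeEnd('noop default')

console.time('optional call')
for (let i = 0; i < N; i++) {
  maybeFn?.()
}
console.timeEnd('optional call')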

Uzlopak commented 1 year ago

What would be more interesting is a benchmark.

gurgunday commented 1 year ago

No medians, no statistics, just the highest scores I could get after running each of them 10 times. Obviously not a breakthrough improvement, and the difference between the last two is, IMO, not significant.

Again, the intention of this PR was to handle defaults the same way throughout the codebase.

master, node 20

150 connections

┌─────────┬──────┬──────┬───────┬──────┬────────┬─────────┬────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg    │ Stdev   │ Max    │
├─────────┼──────┼──────┼───────┼──────┼────────┼─────────┼────────┤
│ Latency │ 2 ms │ 2 ms │ 3 ms  │ 5 ms │ 2.2 ms │ 1.57 ms │ 224 ms │
└─────────┴──────┴──────┴───────┴──────┴────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 47839   │ 47839   │ 51551   │ 51903   │ 51073.46 │ 1245.38 │ 47825   │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 16.8 MB │ 16.8 MB │ 18.7 MB │ 18.8 MB │ 18.5 MB  │ 590 kB  │ 16.8 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

5000 2xx responses, 556809 non 2xx responses
562k requests in 11.01s, 203 MB read

PR, node 20

150 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max    │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼────────┤
│ Latency │ 2 ms │ 2 ms │ 3 ms  │ 5 ms │ 2.16 ms │ 2.12 ms │ 269 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴────────┘
┌───────────┬───────┬───────┬─────────┬─────────┬──────────┬─────────┬───────┐
│ Stat      │ 1%    │ 2.5%  │ 50%     │ 97.5%   │ Avg      │ Stdev   │ Min   │
├───────────┼───────┼───────┼─────────┼─────────┼──────────┼─────────┼───────┤
│ Req/Sec   │ 48351 │ 48351 │ 52543   │ 52959   │ 51780.37 │ 1455.55 │ 48325 │
├───────────┼───────┼───────┼─────────┼─────────┼──────────┼─────────┼───────┤
│ Bytes/Sec │ 17 MB │ 17 MB │ 19.1 MB │ 19.2 MB │ 18.7 MB  │ 659 kB  │ 17 MB │
└───────────┴───────┴───────┴─────────┴─────────┴──────────┴─────────┴───────┘

Req/Bytes counts sampled once per second.
# of samples: 11

5000 2xx responses, 564578 non 2xx responses
570k requests in 11.02s, 206 MB read

PR with empty default functions for onExceeding, onExceeded, onBan, node 20

150 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max    │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼────────┤
│ Latency │ 2 ms │ 2 ms │ 3 ms  │ 5 ms │ 2.17 ms │ 2.13 ms │ 273 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬───────┬─────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%   │ 97.5%   │ Avg      │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼───────┼─────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 47903   │ 47903   │ 52351 │ 52703   │ 51640.73 │ 1499.95 │ 47898   │
├───────────┼─────────┼─────────┼───────┼─────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 16.8 MB │ 16.8 MB │ 19 MB │ 19.1 MB │ 18.7 MB  │ 679 kB  │ 16.8 MB │
└───────────┴─────────┴─────────┴───────┴─────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

5000 2xx responses, 563085 non 2xx responses
568k requests in 11.02s, 206 MB read

Code

import Fastify from 'fastify'
import rateLimit from '../index.js'

const fastify = Fastify()

// 5000 requests allowed per 5-hour window, with banning enabled (ban: 1)
await fastify.register(rateLimit, { max: 5000, timeWindow: '5h', ban: 1 })

// Minimal route so the benchmark measures the limiter, not the handler
fastify.get('/', async (req, reply) => 'hello!')

fastify.listen({ host: '127.0.0.1', port: 5001 }, (err, address) => {
  if (err) throw err
  console.log(address)
})
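
The load itself appears to come from autocannon at 150 connections for roughly 10 seconds; a sketch of an equivalent programmatic run (the options are my assumption, not taken from the PR):

import autocannon from 'autocannon'

// Assumed reproduction of the load above: 150 connections for ~10 s against
// the benchmark server started by the script in this comment.
const result = await autocannon({
  url: 'http://127.0.0.1:5001',
  connections: 150,
  duration: 10
})

console.log(result)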