fastify / fastify-rate-limit

A low overhead rate limiter for your routes
MIT License

Support rate limit groups for specific routes #369

Closed: OleksandrBoiko1 closed this issue 1 month ago

OleksandrBoiko1 commented 5 months ago


🚀 Feature Proposal

It would be great to have an option to apply a rate limit to these two endpoints only, not to the whole project. Something like what is needed here, because the solution suggested in earlier issues does not work.

Important note: I mean that the rate limiter should count calls across these two endpoints together, not separately.

import type { FastifyPluginAsync } from 'fastify';
import type { ZodTypeProvider } from 'fastify-type-provider-zod';

// SCHEMA_TAGS, BodySchema and ResponseSchema are defined elsewhere in this project.

const routes: FastifyPluginAsync = async function (f) {
  const fastify = f.withTypeProvider<ZodTypeProvider>();

  fastify.post(
    '/',
    {
      schema: {
        tags: SCHEMA_TAGS,
        body: BodySchema,
        response: {
          200: ResponseSchema
        }
      }
    },
    async (req) => {
      // Doing something
    }
  );

  fastify.get(
    '/',
    {
      schema: {
        tags: SCHEMA_TAGS,
        response: {
          200: ResponseSchema
        }
      }
    },
    async (req) => {
      // Doing something
    }
  );
};

export default routes;

Thanks!

Motivation

Useful feature!

Example

Register fastifyRateLimit only for the encapsulated endpoints: fastify.register(fastifyRateLimit, { config: { rateLimit: { max: 3, timeWindow: '1 minute' } } })
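One way to scope the limiter today is to lean on Fastify's encapsulation: a plugin registered inside a route plugin only applies to the routes declared in that same scope. A minimal sketch, assuming the plugin is installed as @fastify/rate-limit (formerly fastify-rate-limit); note that even then each route keeps its own counter by default, which is exactly the gap this issue describes.

import fastifyRateLimit from '@fastify/rate-limit';
import type { FastifyPluginAsync } from 'fastify';

// Because the rate limiter is registered inside this plugin, it only applies
// to the routes registered in the same encapsulation context.
const limitedRoutes: FastifyPluginAsync = async (fastify) => {
  await fastify.register(fastifyRateLimit, { max: 3, timeWindow: '1 minute' });

  fastify.post('/', async () => {
    // Doing something
  });

  fastify.get('/', async () => {
    // Doing something
  });
};

export default limitedRoutes;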

mcollina commented 5 months ago

Thanks for reporting! Would you like to send a Pull Request to address this issue? Remember to add unit tests.

aniketcodes commented 3 months ago

Hi @mcollina, I am able to implement this feature with Redis as the store, but not for the LRU cache. I need some help.

mcollina commented 3 months ago

Why? What's the problem? An in-memory implementation should be easier.

aniketcodes commented 3 months ago

With Redis, it is easier to modify the key. As per my understanding, for the LRU cache a different cache instance is created for each route, whereas with Redis the same Redis instance is used for all routes.
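Conceptually (a sketch of the idea, not the plugin's actual code), the difference is what goes into the counter key: a shared store such as Redis makes it straightforward to key counters by a group identifier instead of by route, while per-route caches have no shared place to keep a group counter.

// Sketch only: per-route counting keys the counter by route,
// so each endpoint gets its own budget per client.
const perRouteKey = (routeId: string, clientKey: string) =>
  `rate-limit:${routeId}:${clientKey}`;

// Group counting keys the counter by a shared group id, so every route
// that declares the same group increments the same counter per client.
const groupKey = (groupId: string, clientKey: string) =>
  `rate-limit:group:${groupId}:${clientKey}`;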

aniketcodes commented 2 months ago

It's done.
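For readers landing here later: the issue was closed after the feature was reported as done. A minimal usage sketch, assuming the shipped option is a groupId field inside the per-route rate limit config (treat the exact option name as an assumption and check the plugin's README for the released API):

import Fastify from 'fastify';
import fastifyRateLimit from '@fastify/rate-limit';

const fastify = Fastify();
// global: false disables the project-wide limiter so that only routes
// opting in via config.rateLimit are limited.
await fastify.register(fastifyRateLimit, { global: false });

// Hypothetical groupId: both routes share one counter per client
// because they declare the same group.
const sharedLimit = {
  rateLimit: { max: 3, timeWindow: '1 minute', groupId: 'my-group' }
};

fastify.post('/', { config: sharedLimit }, async () => {
  // Doing something
});

fastify.get('/', { config: sharedLimit }, async () => {
  // Doing something
});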