fastify / fastify-rate-limit

A low overhead rate limiter for your routes

Support rate limit groups for specific routes #369

Open OleksandrBoiko1 opened 2 months ago

OleksandrBoiko1 commented 2 months ago


🚀 Feature Proposal

It would be great to have an option to apply a rate limit to just these two endpoints, rather than to the whole project. Something like what is described in the example below is needed; the solutions suggested in earlier issues do not work for this case.

Important note: the rate limiter should count calls to these two endpoints together, against one shared limit, not separately per route.

import type { FastifyPluginAsync } from 'fastify';
import type { ZodTypeProvider } from 'fastify-type-provider-zod';

// SCHEMA_TAGS, BodySchema and ResponseSchema are defined elsewhere in the project.
const routes: FastifyPluginAsync = async function (f) {
  const fastify = f.withTypeProvider<ZodTypeProvider>();

  fastify.post(
    '/',
    {
      schema: {
        tags: SCHEMA_TAGS,
        body: BodySchema,
        response: {
          200: ResponseSchema
        }
      }
    },
    async (req) => {
      // Doing something
    }
  );

  fastify.get(
    '/',
    {
      schema: {
        tags: SCHEMA_TAGS,
        response: {
          200: ResponseSchema
        }
      }
    },
    async (req) => {
      // Doing something
    }
  );
};

export default routes;

Thanks!

Motivation

Useful feature!

Example

Register fastifyRateLimit only for the encapsulated endpoints:

fastify.register(fastifyRateLimit, { max: 3, timeWindow: '1 minute' })
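A minimal sketch of that suggestion, assuming the plugin is installed as @fastify/rate-limit and assuming that registering it inside an encapsulated plugin would make every route declared in that scope share one counter (which is exactly the behaviour this issue asks to support):

import fastifyRateLimit from '@fastify/rate-limit';
import type { FastifyPluginAsync } from 'fastify';

const limitedRoutes: FastifyPluginAsync = async (fastify) => {
  // Registered inside this plugin only, so the limiter is scoped to the
  // routes declared below rather than to the whole project.
  await fastify.register(fastifyRateLimit, { max: 3, timeWindow: '1 minute' });

  // The proposal is for both handlers to draw from the same
  // "3 requests per minute" budget per client.
  fastify.post('/', async () => {
    // Doing something
  });

  fastify.get('/', async () => {
    // Doing something
  });
};

export default limitedRoutes;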

mcollina commented 2 months ago

Thanks for reporting! Would you like to send a Pull Request to address this issue? Remember to add unit tests.

aniketcodes commented 1 month ago

Hi @mcollina, I am able to implement this feature with Redis as the store, but not with the LRU cache. I need some help.

mcollina commented 1 month ago

Why? What's the problem? An in-memory implementation should be easier.
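For illustration, a minimal fixed-window, in-memory counter keyed by group and client might look like the sketch below. This is illustrative only; it is not the plugin's actual LocalStore, and the names hits and consume are made up for this example.

// Counts requests per (group, client) pair inside a fixed time window.
const hits = new Map<string, { count: number; windowStart: number }>();

function consume(group: string, clientKey: string, max: number, timeWindowMs: number): boolean {
  const key = `${group}:${clientKey}`;
  const now = Date.now();
  const entry = hits.get(key);

  if (!entry || now - entry.windowStart >= timeWindowMs) {
    hits.set(key, { count: 1, windowStart: now });
    return true; // request allowed, new window started
  }

  if (entry.count >= max) return false; // shared limit reached for this group
  entry.count += 1;
  return true;
}

// Both routes of the group draw from the same budget:
consume('public-api', '203.0.113.7', 3, 60_000);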

aniketcodes commented 1 month ago

With Redis it is easy to modify the key. As I understand it, a different cache instance is created for each route when the LRU cache is used, whereas with Redis the same Redis instance is shared across routes.
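As a rough illustration of the point about keys (a hypothetical helper, not the plugin's actual internals, and the key format shown is an assumption): with a shared backend such as Redis, two routes can increment the same counter simply by being mapped to the same key, for example via a group name, whereas per-route cache instances keep independent counters.

function buildCounterKey(
  clientKey: string,
  route: { method: string; url: string },
  group?: string
): string {
  // With a group name, every route in the group maps to the same key,
  // so their requests are counted together; without one, each route
  // gets its own counter.
  const scope = group ?? `${route.method}:${route.url}`;
  return `fastify-rate-limit:${scope}:${clientKey}`;
}

// Both endpoints share one budget for the same client:
buildCounterKey('203.0.113.7', { method: 'POST', url: '/' }, 'public-api'); // fastify-rate-limit:public-api:203.0.113.7
buildCounterKey('203.0.113.7', { method: 'GET', url: '/' }, 'public-api');  // fastify-rate-limit:public-api:203.0.113.7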

aniketcodes commented 4 days ago

It's done.