onur-ozkan / nestjs-rate-limiter

Highly configurable and extensible rate limiter library
https://npmjs.com/package/nestjs-rate-limiter
MIT License
238 stars 43 forks

Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client #22

Closed Langstra closed 4 years ago

Langstra commented 4 years ago

First of all, thanks for taking over the maintenance of the package. Would be a shame to see more packages go to waste.

I installed the packages according to the docs, adding RateLimiterModule.register(rateLimitConfig) to the imports array of AppModule and adding { provide: APP_INTERCEPTOR, useClass: RateLimiterInterceptor } at the end of the providers array. My config is const rateLimitConfig: RateLimiterModuleOptions = { points: 10 };
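
For reference, the setup described above would look roughly like the following. This is a sketch assembled from the names in this thread; the package's exact export paths are assumptions based on its docs, not verified here.

```typescript
// Wiring sketch (illustrative): module registration plus a global interceptor.
import { Module } from '@nestjs/common';
import { APP_INTERCEPTOR } from '@nestjs/core';
import {
  RateLimiterModule,
  RateLimiterInterceptor,
  RateLimiterModuleOptions,
} from 'nestjs-rate-limiter';

// Allow 10 points (requests) per duration window.
const rateLimitConfig: RateLimiterModuleOptions = { points: 10 };

@Module({
  imports: [RateLimiterModule.register(rateLimitConfig)],
  providers: [
    { provide: APP_INTERCEPTOR, useClass: RateLimiterInterceptor },
  ],
})
export class AppModule {}
```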

This gives the following error:

console.error
    Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
        at ServerResponse.setHeader (_http_outgoing.js:533:11)
        at ServerResponse.header (/path/to/project/node_modules/express/lib/response.js:771:10)
        at ServerResponse.send (/path/to/project/node_modules/express/lib/response.js:170:12)
        at ServerResponse.json (/path/to/project/node_modules/express/lib/response.js:267:15)
        at ExpressAdapter.reply (/path/to/project/node_modules/@nestjs/platform-express/adapters/express-adapter.js:24:57)
        at ExceptionsHandler.handleUnknownError (/path/to/project/node_modules/@nestjs/core/exceptions/base-exception-filter.js:33:24)
        at ExceptionsHandler.catch (/path/to/project/node_modules/@nestjs/core/exceptions/base-exception-filter.js:17:25)
        at ExceptionsHandler.next (/path/to/project/node_modules/@nestjs/core/exceptions/exceptions-handler.js:16:20)
        at /path/to/project/node_modules/@nestjs/core/router/router-proxy.js:13:35
        at processTicksAndRejections (internal/process/task_queues.js:97:5)

The rate limiting is applied however.
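
For what it's worth, the error itself is easy to reproduce outside of Nest: once Node has rendered a response's headers, any further setHeader() call throws. A minimal stand-alone sketch (not this package's code) using only Node built-ins:

```typescript
import { IncomingMessage, ServerResponse } from 'node:http';
import { Socket } from 'node:net';

// Stand-alone repro of ERR_HTTP_HEADERS_SENT (illustrative only):
// end() renders the status line and headers, so a later setHeader() throws.
export function setHeaderAfterSend(): string {
  const res = new ServerResponse(new IncomingMessage(new Socket()));
  res.end('ok'); // renders and queues the headers
  try {
    res.setHeader('X-RateLimit-Limit', '10'); // too late
    return 'no error';
  } catch (err: any) {
    return err.code; // 'ERR_HTTP_HEADERS_SENT'
  }
}
```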

Also I was wondering how I can disable the rate limiting while testing the application using jest. (maybe I could change the config before creating the application).

onur-ozkan commented 4 years ago

Thanks! I also see the error you mentioned with the 7.x version of Nest. My guess is that the error belongs to Nest, not to this package. However, I will work on this issue to see what actually causes it.

onur-ozkan commented 4 years ago

@Langstra And for the last question, you can enable interceptor conditionally to only have rate limiting in production or not in tests.
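
One way to register the interceptor conditionally (an assumption about how you might wire it, not the package's documented API) is to build the providers array from an environment check:

```typescript
// Hypothetical wiring sketch: skip the global interceptor under test.
// The NODE_ENV check is an assumption; adapt it to your own setup.
import { Module } from '@nestjs/common';
import { APP_INTERCEPTOR } from '@nestjs/core';
import { RateLimiterModule, RateLimiterInterceptor } from 'nestjs-rate-limiter';

const rateLimiterProviders =
  process.env.NODE_ENV === 'test'
    ? [] // tests run without rate limiting
    : [{ provide: APP_INTERCEPTOR, useClass: RateLimiterInterceptor }];

@Module({
  imports: [RateLimiterModule.register({ points: 10 })],
  providers: rateLimiterProviders,
})
export class AppModule {}
```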

Langstra commented 4 years ago

Hmm... It might be an error that Nest throws, but I guess the interceptor tries to set headers after Nest has already sent them. Shouldn't it set the headers before Nest sends the response?

As for disabling the interceptor: we have a test that verifies the rate limiter is added as an interceptor. That test needs the interceptor enabled, while all other tests need it disabled.

wallopthecat commented 4 years ago

I can confirm this behavior on Nest 7 - what can I do to help?

onur-ozkan commented 4 years ago

@wallopthecat I was about to work on that this weekend. But if you want to help, sure! Let me know if you want to take this issue, and I'll assign it to you.

Langstra commented 4 years ago

We think that we found the problem.

await this.responseHandler(response, key, rateLimiter, points, pointsConsumed)
return next.handle()

After handling the rate limit check you always continue to next.handle(). That triggers the controller method, which will then set headers again, right?

Also as for disabling the interceptor in the tests. Maybe you can add something like

if (process.env.DISABLE_RATE_LIMITING) {
  logger.debug('Rate limiting disabled');
  return next.handle();
}

That would make it easy to disable it for certain tests, but not all, at runtime.

bkulgat commented 4 years ago

Here is a quick fix: leave error response handling to the framework instead of generating a custom response inside the interceptor.

// change from 
await this.responseHandler(response, key, rateLimiter, points, pointsConsumed)
return next.handle()

// to
return await this.responseHandler(response, key, rateLimiter, points, pointsConsumed, next)

and responseHandler like;

private async responseHandler(
    response: any,
    key: any,
    rateLimiter: RateLimiterMemory,
    points: number,
    pointsConsumed: number,
    next: CallHandler,
  ): Promise<any> {
    if (this.options.for === 'Fastify' || this.options.for === 'FastifyGraphql') {
      try {
        const rateLimiterResponse: RateLimiterRes = await rateLimiter.consume(key, pointsConsumed);

        response.header('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000));
        response.header('X-RateLimit-Limit', points);
        response.header('X-Retry-Remaining', rateLimiterResponse.remainingPoints);
        response.header('X-Retry-Reset', new Date(Date.now() + rateLimiterResponse.msBeforeNext).toUTCString());

        return next.handle();
      } catch (rateLimiterResponse) {
        throw new HttpException({
          statusCode: HttpStatus.TOO_MANY_REQUESTS,
          error: 'Too Many Requests',
          message: this.options.errorMessage,
          'retry-after': Math.ceil(rateLimiterResponse.msBeforeNext / 1000),
        }, HttpStatus.TOO_MANY_REQUESTS);
      }
    } else {
      try {
        const rateLimiterResponse: RateLimiterRes = await rateLimiter.consume(key, pointsConsumed);

        response.set('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000));
        response.set('X-RateLimit-Limit', points);
        response.set('X-Retry-Remaining', rateLimiterResponse.remainingPoints);
        response.set('X-Retry-Reset', new Date(Date.now() + rateLimiterResponse.msBeforeNext).toUTCString());

        return next.handle();
      } catch (rateLimiterResponse) {
        throw new HttpException({
          statusCode: HttpStatus.TOO_MANY_REQUESTS,
          error: 'Too Many Requests',
          message: this.options.errorMessage,
          'retry-after': Math.ceil(rateLimiterResponse.msBeforeNext / 1000),
        }, HttpStatus.TOO_MANY_REQUESTS);
      }
    }
  }

Charul090 commented 3 years ago

Still facing the same issue.

onur-ozkan commented 3 years ago

Which version are you using? @Charul090

bkulgat commented 3 years ago

Same issue in rate-limiter.interceptor.ts:

private async responseHandler(response: any, key: any, rateLimiter: RateLimiterAbstract, points: number, pointsConsumed: number, next: CallHandler) {
    if (this.options.for === 'Fastify' || this.options.for === 'FastifyGraphql') {
      try {
        if (this.spesificOptions?.queueEnabled || this.options.queueEnabled) await this.queueLimiter.removeTokens(1)
        else {
          const rateLimiterResponse: RateLimiterRes = await rateLimiter.consume(key, pointsConsumed)

          response.header('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000))
          response.header('X-RateLimit-Limit', points)
          response.header('X-Retry-Remaining', rateLimiterResponse.remainingPoints)
          response.header('X-Retry-Reset', new Date(Date.now() + rateLimiterResponse.msBeforeNext).toUTCString())
        }
      } catch (rateLimiterResponse) {
        response.header('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000))
        response
          .code(429)
          .header('Content-Type', 'application/json; charset=utf-8')
          .send({
            statusCode: HttpStatus.TOO_MANY_REQUESTS,
            error: 'Too Many Requests',
            message: this.spesificOptions?.errorMessage || this.options.errorMessage
          })
      }
      return next.handle()
    } else {
      try {
        if (this.spesificOptions?.queueEnabled || this.options.queueEnabled) await this.queueLimiter.removeTokens(1)
        else {
          const rateLimiterResponse: RateLimiterRes = await rateLimiter.consume(key, pointsConsumed)

          response.set('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000))
          response.set('X-RateLimit-Limit', points)
          response.set('X-Retry-Remaining', rateLimiterResponse.remainingPoints)
          response.set('X-Retry-Reset', new Date(Date.now() + rateLimiterResponse.msBeforeNext).toUTCString())
        }
      } catch (rateLimiterResponse) {
        response.set('Retry-After', Math.ceil(rateLimiterResponse.msBeforeNext / 1000))
        response.status(429).json({
          statusCode: HttpStatus.TOO_MANY_REQUESTS,
          error: 'Too Many Requests',
          message: this.spesificOptions?.errorMessage || this.options.errorMessage
        })
      }
      return next.handle()
    }
  }

return next.handle() should be inside the try-catch block. It still tries to continue to the next handler even when an exception is thrown.
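
The control-flow point can be shown with stand-in types. This is a sketch of the argument, not the package's actual interceptor: when the limiter rejects, the chain must not fall through to the controller.

```typescript
// Minimal stand-ins for the NestJS types, just to illustrate control flow.
type CallHandler<T> = { handle: () => T };

export class TooManyRequestsException extends Error {}

// consume() stands in for rateLimiter.consume(): it throws when the caller
// is over its quota. Because next.handle() sits inside the try, an
// exhausted limiter can never continue on to the controller.
export function intercept<T>(consume: () => void, next: CallHandler<T>): T {
  try {
    consume();
    return next.handle(); // only reached when consumption succeeds
  } catch {
    throw new TooManyRequestsException('Too Many Requests');
  }
}
```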

onur-ozkan commented 3 years ago

@bkulgat Then it causes another issue, #1.

onur-ozkan commented 3 years ago

Finally, I fixed both #1 and this issue without breaking either of them :smile: v2.5.6 should work without hitting either this problem or #1.