upstash / ratelimit-js

Rate limiting library for serverless runtimes
https://ratelimit-with-vercel-kv.vercel.app
MIT License

Update Next.js Middleware example #12

Closed: ghost closed this issue 2 years ago

ghost commented 2 years ago

Hey guys! Figured I'd throw an issue in here to comment on the new middleware changes from: https://nextjs.org/docs/messages/middleware-upgrade-guide#no-response-body

The following patterns will no longer work (used in examples/nextjs/pages/api/_middleware.ts):

new Response('a text value')
new Response(streamOrBuffer)
new Response(JSON.stringify(obj), { headers: { 'content-type': 'application/json' } })
NextResponse.json()

Is the code below an appropriate replacement given the new middleware? In the docs, they say:

To produce a response from Middleware, you should rewrite to a route (Page or Edge API Route) that produces a response.

import type { NextRequest, NextFetchEvent } from "next/server"
import { NextResponse } from "next/server"
import { Ratelimit } from "@upstash/ratelimit"
import { Redis } from "@upstash/redis"

const url = process.env.UPSTASH_REDIS_REST_URL!
const token = process.env.UPSTASH_REDIS_REST_TOKEN!

export const middleware = async (req: NextRequest, event: NextFetchEvent) => {
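  // req.ip is only populated when deployed on Vercel; fall back for local development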
  const ip = req.ip ?? "127.0.0.1"
  const redis = new Redis({
    url,
    token
  })
  // Create a new ratelimiter that allows 10 requests per 10 seconds
  const ratelimit = new Ratelimit({
    redis: redis,
    limiter: Ratelimit.slidingWindow(10, "10 s")
  })
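  // limit() resolves with success (whether this request is allowed) and pending,
  // a promise for any background work the limiter still has in flight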
  const { success, pending } = await ratelimit.limit(ip)

  event.waitUntil(pending)

  if (!success) {
    return NextResponse.rewrite(new URL("/rate-limit", req.url))
  }
}
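
For the rewrite to actually show something, /rate-limit has to exist as a route that produces the response itself (the "Page" option from the docs quoted above). A minimal sketch, assuming a Pages Router page whose file name and copy are not part of this thread:

// pages/rate-limit.tsx (hypothetical rewrite destination, not from the original issue)
export default function RateLimitPage() {
  return <p>Too many requests. Please try again in a few seconds.</p>;
}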
chronark commented 2 years ago

Hey @thomaswang, thanks for letting us know. I've updated the example to this:

import { NextFetchEvent, NextRequest, NextResponse } from "next/server";
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.fixedWindow(10, "10 s"),
});

export default async function middleware(
  request: NextRequest,
  event: NextFetchEvent,
): Promise<Response | undefined> {
  const ip = request.ip ?? "127.0.0.1";
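  // namespace the identifier so middleware counts stay separate from any other
  // limiter using the same Redis database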
  const { success, pending, limit, reset, remaining } = await ratelimit.limit(`mw_${ip}`);
  event.waitUntil(pending);

  const res = success
    ? NextResponse.next()
    : NextResponse.rewrite(new URL("/api/blocked", request.url));

  res.headers.set("X-RateLimit-Limit", limit.toString());
  res.headers.set("X-RateLimit-Remaining", remaining.toString());
  res.headers.set("X-RateLimit-Reset", reset.toString());
  return res;
}

export const config = {
  matcher: "/api/hello",
};

One important difference from your suggestion: the Ratelimit instance is created outside the handler, which allows limits to be cached between invocations.
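
Likewise, /api/blocked isn't included in the thread; it only needs to return the actual 429. A minimal sketch, assuming a plain (non-edge) Pages Router API route and a made-up payload:

// pages/api/blocked.ts (hypothetical rewrite destination, not from the original issue)
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(_req: NextApiRequest, res: NextApiResponse) {
  res.status(429).json({ error: "Too many requests" });
}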