vercel / next.js

The React Framework
https://nextjs.org
MIT License

Server-Sent Events don't work in Next API routes #9965

Closed: trezy closed this issue 1 year ago

trezy commented 4 years ago

Bug report

Describe the bug

When using Next's API routes, chunks that are written with res.write aren't sent until after res.end() is called.

To Reproduce

Steps to reproduce the behavior, please provide code snippets or a repository:

  1. Create the following API route in a Next app:

    export default async (req, res) => {
      let intervalID = null

      res.setHeader('Content-Type', 'text/event-stream')
      res.write('data: CONNECTION ESTABLISHED\n')

      const end = () => {
        if (intervalID) {
          clearTimeout(intervalID)
        }
      }

      req.on('aborted', end)
      req.on('close', end)

      const sendData = () => {
        const timestamp = (new Date).toISOString()
        res.write(`data: ${timestamp}\n`)
      }

      intervalID = setInterval(sendData, 1000)
    }
  2. Connect to the route with a tool that supports Server-Sent Events (e.g. Postwoman).

Expected behavior

The route sends a new event to the connection every second.

Actual behavior

The route doesn't send any data to the connection unless a call to res.end() is added to the route.

System information

Additional context

When using other HTTP frameworks (Express, Koa, plain http, etc.), this method works as expected. It's explicitly supported by Node's http.IncomingMessage and http.ServerResponse classes which, from what I understand, Next uses as the base for the req and res that are passed into Next API routes.
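
For comparison, here is a minimal sketch of the same pattern on a bare Node http server (no Next.js involved); the port and timing are arbitrary:

// Plain Node http server: each res.write reaches the client as soon as it is called.
const http = require('http')

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  })

  const intervalID = setInterval(() => {
    // Each SSE message ends with a blank line so EventSource dispatches it.
    res.write(`data: ${new Date().toISOString()}\n\n`)
  }, 1000)

  req.on('close', () => clearInterval(intervalID))
}).listen(3001)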

I'd hazard a guess that #5855 was caused by the same issue, but considered unrelated because the issue was obscured by the express-sse library.

There are also two Spectrum topics about this (here and here) that haven't garnered much attention yet.

Supporting Websockets and SSE in Next API routes may be related, but fixing support for SSE should be a lower barrier than adding support for WebSockets. All of the inner workings are there; we just need to get the plumbing repaired.

trezy commented 4 years ago

I forgot to mention that this works in Micro routes, as well. I'm trying to eliminate the need for my Micro API by moving everything into Next, but this is a blocker for me.

msand commented 4 years ago

You can use a custom server.js to work around this for now:

require('dotenv').config();
const app = require('express')();
const server = require('http').Server(app);
const next = require('next');

const DSN = process.env.DSN || 'postgres://postgres:postgres@localhost/db';
const dev = process.env.NODE_ENV !== 'production';

const nextApp = next({ dev });
const nextHandler = nextApp.getRequestHandler();

nextApp.prepare().then(() => {
  app.get('*', (req, res) => {
    if (req.url === '/stream') {
      res.writeHead(200, {
        Connection: 'keep-alive',
        'Cache-Control': 'no-cache',
        'Content-Type': 'text/event-stream',
      });
      res.write('data: Processing...\n\n');
      setTimeout(() => {
        res.write('data: Processing2...\n\n');
      }, 10000);
    } else {
      return nextHandler(req, res);
    }
  });

  require('../websocket/initWebSocketServer')(server, DSN);

  const port = 8080;
  server.listen(port, err => {
    if (err) throw err;
    console.log('> Ready on http://localhost:' + port);
  });
});

And on the client:

componentDidMount() {
  this.source = new EventSource('/stream')
  this.source.onmessage = function(e) {
    console.log(e)
  }
}
msand commented 4 years ago

I would still recommend keeping any server-sent event and WebSocket handlers in separate processes in production. It's very likely that the frequencies of updates to those parts of the business logic are quite different. Your front end most likely changes more often than the types of events you handle / need to push to clients from the servers. If you only make changes to one, you probably don't want to restart the processes responsible for the other(s). Better to keep the connections alive than to cause a flood of reconnections / server restarts for changes which have no effect.

trezy commented 4 years ago

@msand The main reason I'm trying to avoid using a custom server is that I'm deploying to Now. Using a custom server would break all of the wonderful serverless functionality I get there.

Your second point is fair. What I'm trying to do is create an SSE stream for data that would otherwise be handled with basic polling. The server is already dealing with constant reconnections in that case, so an SSE stream actually results in fewer reconnections.

I suppose I could set up a small webserver in the same repo that just uses a separate Now builder. That would allow the processes to remain separate, though it'd still cause all of the SSE connections to abort and reconnect when there are any changes to the project.

Even with those points, I can see plenty of scenarios in which it makes sense to be able to run an SSE endpoint from one of Next's API routes. Additionally, in the docs it's specifically stated that...

Since it's specifically stated that res is an instance of http.ServerResponse, I'd expect it to behave exactly the way http.ServerResponse behaves in any other circumstance. Either the documentation should change to reflect the quirks of the implementation or, preferably, res.write should be fixed to behave the way it does everywhere else.

msand commented 4 years ago

@trezy It seems the issue is that the middleware adds a gzip encoding which the browser has negotiated using the header:

Accept-Encoding: gzip, deflate, br

If you add

Content-Encoding: none

then it seems to work:

  res.writeHead(200, {
    Connection: 'keep-alive',
    'Content-Encoding': 'none',
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  });
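
Applied to a Next API route, the workaround might look roughly like this (a sketch; the pages/api/stream.js path and timings are just examples):

// pages/api/stream.js (illustrative path)
export default (req, res) => {
  res.writeHead(200, {
    Connection: 'keep-alive',
    'Content-Encoding': 'none', // keeps the negotiated gzip middleware from buffering writes
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  })

  res.write('data: CONNECTION ESTABLISHED\n\n')

  const intervalID = setInterval(() => {
    res.write(`data: ${new Date().toISOString()}\n\n`)
  }, 1000)

  req.on('close', () => clearInterval(intervalID))
}
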
msand commented 4 years ago

Alternatively, gzip your content

trezy commented 4 years ago

Oh, that's super interesting! I'll give that a shot and report back. In the meantime, it'd still be nice for this quirk (and any similar ones) to be noted somewhere in the docs.

msand commented 4 years ago

Yeah, it's more a consequence of having some helpers. It would be nice to have a mode that turns all of them off and leaves just a plain req/res pair.

msand commented 4 years ago

Actually, this seems to be documented here: https://github.com/expressjs/compression#server-sent-events

You have to call res.flush() when you think there's enough data for the compression to work efficiently:

export default (req, res) => {
  res.writeHead(200, {
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  });
  res.write('data: Processing...');
  /* https://github.com/expressjs/compression#server-sent-events
    Because of the nature of compression this module does not work out of the box with
    server-sent events. To compress content, a window of the output needs to be
    buffered up in order to get good compression. Typically when using server-sent
    events, there are certain block of data that need to reach the client.

    You can achieve this by calling res.flush() when you need the data written to
    actually make it to the client.
  */
  res.flush();
  setTimeout(() => {
    res.write('data: Processing2...');
    res.flush();
  }, 1000);
};
msand commented 4 years ago

It then applies gzip compression for you

uxFeranmi commented 4 years ago

I have switched to using a custom Express server. That's the only way I could get it to work. I guess that's cool, since I can do more with Express.

Before deciding to integrate Express, I had tried the things mentioned above; none worked.

  1. Turned off gzip compression by setting the option in next.config.js (see the config sketch after this list). The behavior remained the same. I inspected the headers on the client (using Postman) and confirmed the gzip encoding was removed, but that didn't seem to fix the problem.

  2. Calling res.flush had no effect either. Instead I get a warning in the console that flush is deprecated and to use flushHeaders instead. But that's not what I want.
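
For reference, the next.config.js option mentioned in point 1 is the compress flag; a minimal sketch (note this only disables Next's built-in gzip, not compression added by a proxy in front of the app):

// next.config.js
module.exports = {
  compress: false, // disable Next's built-in gzip compression
}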

This is a rather strange bug.. 😔

kavuri commented 4 years ago

I have been trying to get SSE to work in Next.js, but could not get it working. With a custom server and the native Node httpServer req/res it works, but with the Next.js res, no messages are sent to the client. I started using Next.js to get the advantages of server-side rendering, SLS, and having the server and client code together; using Express defeats the purpose. Any pointers on how this could work? This is a blocking problem for me.

uxFeranmi commented 4 years ago

Hey @kavuri. It is possible to integrate a custom Node.js server (e.g. using Express) with your Next.js app. That way, you can still get server-side rendering without these Next.js limitations.

See this page of the official documentation for details: https://nextjs.org/docs/advanced-features/custom-server

Also, check out how I implemented this in my own app which I mentioned in the comment above yours: https://github.com/uxFeranmi/react-woocommerce/blob/master/server.js

kavuri commented 4 years ago

@uxFeranmi I could use the custom server method as mentioned here https://nextjs.org/docs/advanced-features/custom-server to write messages with res.write(...). But in the Next app, I do not see any messages on my page. I have created a sample page index.js and a React component App.js in the pages dir as follows:

import EventSource from 'eventsource'

class App extends React.Component {
   constructor(props) {
      super(props) 
      this.events = new EventSource('http://localhost:3000/test')
      this.events.onopen = function() {
        console.log('connection is opened');
      }
      this.events.onerror = function() {
        console.log('error in opening conn.');
      }
   }
   componentDidMount() {
      this.events.onmessage = (event) => {
         console.log('got message..',event)
         this.data = JSON.parse(event.data)
      }
   }

   componentWillUnmount() {
      // cleanup
   }

   render() {
      return (
         <div>
           <h1>{this.data}</h1>
         </div>
      );
   }
}

index.js

import App from './App.js'

function HomePage() {
  return <div><App /></div>
}
export default HomePage

My custom server.js

const { createServer } = require('http')
const { parse } = require('url')
const next = require('next')
const fs = require('fs')

const port = parseInt(process.env.PORT, 10) || 3000
const dev = process.env.NODE_ENV !== 'production'
const app = next({ dev })
const handle = app.getRequestHandler()

console.log('starting server...')
function listen(req, res) {
    console.log('listening for incoming orders...');
    // Create a change stream. The 'change' event gets emitted when there's a
    // change in the database
    fs.watch('./', (eventType, filename) => {
      if (filename) {
        var obj = {"text": filename}
        console.log('sending:',obj);
        res.write('data:' + JSON.stringify(obj));
      }
    });

    res.on('close', () => {
       console.log('closing connection');
    });
}

app.prepare().then(() => {
  createServer((req, res) => {
    const parsedUrl = parse(req.url, true)
    const { pathname, query } = parsedUrl

    if (pathname === '/test') {
         const headers = {
        'Content-Type': 'text/event-stream',
        'Connection': 'keep-alive',
        'Cache-Control': 'no-cache'
      };
      res.writeHead(200, headers);
      res.write('\n')
      listen(req, res)
    } else {
      handle(req, res, parsedUrl)
    }
  }).listen(port, err => {
    if (err) throw err
    console.log(`> Ready on http://localhost:${port}`)
  })
})

I am not getting any messages on the index page. But if I open the URL http://localhost:3000/test directly, I get the messages, which means that the EventSource endpoint itself is working, but the Next server-side rendering for the EventSource is not. Or maybe I am doing something wrong! Any pointers?

kavuri commented 4 years ago

I have created a small test to trigger the event. Basically, just create a file in the project root directory (say, just touch <filename>).

uxFeranmi commented 4 years ago

I don't think you want to set up the EventSource in the constructor. I think you should put it in componentDidMount(), same as your message listener.

You cannot start the EventSource on the server side, because then subsequent messages would be sent to the server, not the browser/client. So you have to initialize the EventSource after the component has been rendered in the browser.

Do this either with the useEffect hook for function components or, in your case, I believe it should be in componentDidMount().

PS: You probably don't need to import 'eventsource'.
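
Roughly, that change might look like the sketch below (using the browser's built-in EventSource and setState so new data actually re-renders; the /test URL matches the earlier custom-server snippet):

class App extends React.Component {
  state = { data: null }

  componentDidMount() {
    // Runs only in the browser, so the connection belongs to the client, not the server.
    this.events = new EventSource('/test')
    this.events.onmessage = (event) => {
      this.setState({ data: JSON.parse(event.data) })
    }
  }

  componentWillUnmount() {
    // Close the stream so the component doesn't leak the connection.
    if (this.events) this.events.close()
  }

  render() {
    return <h1>{this.state.data ? this.state.data.text : null}</h1>
  }
}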

kavuri commented 4 years ago

@uxFeranmi I have moved the event-opening code and the corresponding functions to componentDidMount(), but it had no effect. I still do not see any messages in the console or on the screen.

I am on the verge of giving up on Next and moving back to Express and a standalone web UI server.

uxFeranmi commented 4 years ago

In my case, the SSE was triggered by clicking a button on the page. The click event called this function:

const authenticate = (email, callback)=> {
  const sse = new EventSource(`/api/auth/sign-in?email=${email}`);

  sse.addEventListener("message", (e)=> {
    console.log('Default message event\n', e);
  });

  sse.addEventListener("received", (e)=> {
    const {type: event, data} = e;
    callback({event, data});
    console.log(`${event}: ${data}`);
  });

  sse.addEventListener("mailsent", (e)=> {
    const {type: event, data} = e;
    callback({event, data});
    console.log(`${event}: ${data}`);
  });

  sse.addEventListener("authenticated", (e)=> {
    const {type: event, data} = e;
    callback({event, data});
    console.log(`${event}: ${data}`);
    sse.close();
  });

  sse.addEventListener("timeout", (e)=> {
    const {type: event, data} = e;
    callback({event, data});
    console.log(`${event}: ${data}`);
    sse.close();
  });

  sse.addEventListener("error", (e)=> {
    const {type: event, data} = e;
    let customData = '';

    // If connection is closed.
    // 0 — connecting, 1 — open, 2 — closed
    if (sse.readyState === 2) {
      console.log('SSE closed', e);
      customData = "Connection to server was lost and couldn't be re-established.";
    }

    // If still connected & it's an unknown error, attempt reconnection.
    else if (!data) return console.log('Reconnecting SSE...');

    sse.close();
    console.log('Closed SSE...');
    console.log(`${event}: ${customData || data}`);
    callback({event, data: customData || data});
  });
};

export default authenticate;

This function simply takes in a value to use in the URL query parameter (you'll need to remove this since yours is a fixed URL), and a callback function. The callback should be setState, so that the data from each new SSE event is put in your component's state. (You may need to modify this too, since I'm using the useState hook and you're using class components.)

I'm sorry if this isn't very helpful. I can't figure out much just from the code snippet you shared, so I'm showing you my own code hoping it'll work for you.
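
For completeness, the calling side might be wired up roughly like this (a sketch; the component name, import path, and email value are illustrative):

import React, { useState } from 'react';
import authenticate from '../utils/authenticate'; // illustrative path to the function above

function SignIn() {
  const [status, setStatus] = useState(null);

  return (
    <div>
      {/* Clicking the button opens the SSE connection; setState receives each event. */}
      <button onClick={() => authenticate('user@example.com', setStatus)}>
        Sign in
      </button>
      {status && <p>{status.event}: {status.data}</p>}
    </div>
  );
}

export default SignIn;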

kavuri commented 4 years ago

I could see that my API endpoint is getting triggered by the UI in the Network tab of the Chrome inspector, but no updates are seen in the browser.

uxFeranmi commented 4 years ago

There's a chance this is related to the gzip compression. Try sending a final message with res.end().

uxFeranmi commented 4 years ago

Gzip compression can cause all events/messages to be queued up until you call res.end(), at which point all messages are sent at once. Check DevTools to see whether the Content-Encoding in your response headers is gzip.

ijjk commented 4 years ago

Hi, server-sent events from an API endpoint appear to be working correctly in Next.js itself. Compression shouldn't be affecting the stream as long as you set res.setHeader('Cache-Control', 'no-cache, no-transform'), specifically the no-transform bit (see the related compression code).

Note: they will not work in a serverless environment, since those environments are typically buffered and don't allow streaming the response from the lambda. Related AWS Lambda docs here. If you want to create a pub-sub system, services like pusher.com are better suited for this and complement deploying your applications on ZEIT or other serverless environments very well.

Here's a gif of it working locally without any custom next.config.js or a custom-server

[gif: api-routes-sse]

wenerme commented 4 years ago

Code from the above comment, if you want to try it:

import {NextApiRequest, NextApiResponse} from 'next'

export const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));

// curl -Nv localhost:3000/api/see
const handler = async (req: NextApiRequest, res: NextApiResponse) => {
  res.setHeader('Access-Control-Allow-Origin', '*');
  res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
  res.setHeader('Cache-Control', 'no-cache, no-transform');
  res.setHeader('X-Accel-Buffering', 'no');

  for (let i = 0; i < 5; i++) {
    res.write(`data: Hello seq ${i}\n\n`);
    await sleep(1000);
  }
  res.end('done\n');
};

export default handler;
trezy commented 4 years ago

Thanks for all the effort on this ticket, folx! I wanted to pop in and say I think we can mark it as resolved. Here's a quick TL;DR:

ijjk commented 4 years ago

Closing per https://github.com/zeit/next.js/issues/9965#issuecomment-614823642

leinadpb commented 4 years ago

@trezy It seems the issue is that the middleware adds a gzip encoding which the browser has negotiated using the header:

Accept-Encoding: gzip, deflate, br

If you add

Content-Encoding: none

then it seems to work:

  res.writeHead(200, {
    Connection: 'keep-alive',
    'Content-Encoding': 'none',
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  });

Thanks! This was causing issues when deployed in OpenShift. I'm returning this header with "none" from my Spring WebFlux SSE service and it is working as expected now!

stophecom commented 3 years ago

For those wondering… for Vercel/Now you can find an explanation of why streaming responses are not supported: https://vercel.com/docs/platform/limits#streaming-responses

balazsorban44 commented 2 years ago

This issue has been automatically locked due to no recent activity. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.

leerob commented 1 year ago

For those stumbling onto this through Google, this is working as of Next.js 13 + Route Handlers:

// app/api/route.ts
import { Configuration, OpenAIApi } from 'openai';

export const runtime = 'nodejs';
// This is required to enable streaming
export const dynamic = 'force-dynamic';

export async function GET() {
  const configuration = new Configuration({
    apiKey: process.env.OPENAI_API_KEY,
  });
  const openai = new OpenAIApi(configuration);

  let responseStream = new TransformStream();
  const writer = responseStream.writable.getWriter();
  const encoder = new TextEncoder();

  writer.write(encoder.encode('Vercel is a platform for....'));

  try {
    const openaiRes = await openai.createCompletion(
      {
        model: 'text-davinci-002',
        prompt: 'Vercel is a platform for',
        max_tokens: 100,
        temperature: 0,
        stream: true,
      },
      { responseType: 'stream' }
    );

    // @ts-ignore
    openaiRes.data.on('data', async (data: Buffer) => {
      const lines = data
        .toString()
        .split('\n')
        .filter((line: string) => line.trim() !== '');
      for (const line of lines) {
        const message = line.replace(/^data: /, '');
        if (message === '[DONE]') {
          console.log('Stream completed');
          writer.close();
          return;
        }
        try {
          const parsed = JSON.parse(message);
          await writer.write(encoder.encode(`${parsed.choices[0].text}`));
        } catch (error) {
          console.error('Could not JSON parse stream message', message, error);
        }
      }
    });
  } catch (error) {
    console.error('An error occurred during OpenAI request', error);
    writer.write(encoder.encode('An error occurred during OpenAI request'));
    writer.close();
  }

  return new Response(responseStream.readable, {
    headers: {
      'Content-Type': 'text/event-stream',
      Connection: 'keep-alive',
      'Cache-Control': 'no-cache, no-transform',
    },
  });
}
leerob commented 1 year ago

I'm going to unlock this because I've been sent it a handful of times, so it must be coming up in Google searches more often. Will transfer to a discussion instead of an issue 👍