Nutlope / twitterbio

Generate your Twitter bio with Mixtral and GPT-3.5.
https://www.twitterbio.io
MIT License
1.66k stars 465 forks

How to stream response using Next.js Serverless functions instead? #33

Open GorvGoyl opened 1 year ago

GorvGoyl commented 1 year ago

I can't use Edge Functions because some of my dependencies (Firebase) require Node.js. Is there a way to stream the response from OpenAI and pass it to the frontend using serverless functions?

smaeda-ks commented 1 year ago

@GorvGoyl You could technically do that: https://vercel.com/blog/streaming-for-serverless-node-js-and-edge-runtimes-with-vercel-functions

As mentioned in the blog post above, it is only available in a few environments today. For Next.js apps, streaming serverless functions are only available in Next.js 13.2+, and only in Route Handlers (that are not prerendered); pages/api isn't supported.

But in most cases you should consider using the edge runtime instead, as the serverless runtime is (it's fair to say "very") expensive in terms of both cost and performance. Also, for streaming responses: while edge functions have no hard limit on how long they can keep streaming data after the initial HTTP response, serverless functions are still subject to their execution timeout limits.
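To make the Route Handler option above concrete, here is a minimal sketch of a streaming handler on the Node.js runtime (Next.js 13.2+). The route path and the hard-coded chunks are illustrative assumptions; a real handler would forward OpenAI's streamed chunks instead.

```typescript
// Hypothetical app/api/generate/route.ts — a sketch, not the repo's actual code.
// Opt into the Node.js runtime so Node-only dependencies (e.g. Firebase) work.
export const runtime = "nodejs";

export async function POST(req: Request): Promise<Response> {
  const encoder = new TextEncoder();

  // Build a web ReadableStream; in practice you would enqueue each chunk
  // as it arrives from the OpenAI streaming API.
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of ["Hello", ", ", "world"]) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });

  // Returning the stream lets Next.js flush chunks to the client as they
  // are produced, instead of buffering the whole response.
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

Note that this relies on the streaming support described in the Vercel blog post; on older Next.js versions or in pages/api, the response would be buffered.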

GorvGoyl commented 1 year ago

Thanks. I use Firebase to authenticate the user (API request) first, but the Firebase package isn't officially supported in non-Node runtimes, hence my choosing serverless functions.

smaeda-ks commented 1 year ago

There are libraries such as https://github.com/awinogrodzki/next-firebase-auth-edge that let you implement the authentication on the edge today.

And you can further control Middleware invocations using the matcher config property: https://nextjs.org/docs/advanced-features/middleware#matcher
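For reference, the matcher lives in a `config` export in `middleware.ts`. A minimal sketch (the route pattern below is an assumption for illustration, not taken from this repo):

```typescript
// Hypothetical middleware.ts — restricts Middleware invocations via `matcher`.
export const config = {
  // Run the middleware (e.g. edge Firebase auth) only for the API routes
  // that actually need authentication; the path here is illustrative.
  matcher: ["/api/generate/:path*"],
};
```

Scoping the matcher this way avoids paying the middleware invocation cost on every request to the site.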