nlkitai / nlux

The π—£π—Όπ˜„π—²π—Ώπ—³π˜‚π—Ή Conversational AI JavaScript Library πŸ’¬ β€”Β UI for any LLM, supporting LangChain / HuggingFace / Vercel AI, and more 🧑 React, Next.js, and plain JavaScript ⭐️
https://docs.nlkit.com/nlux
937 stars 48 forks

Streaming in Getting Started doc for Next.js + Vercel AI #98

Closed FranciscoMoretti closed 1 week ago

FranciscoMoretti commented 1 week ago

Changes the Getting Started doc for Next.js + Vercel AI to do text streaming instead of returning the response in a single batch.
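For context, the batch-to-streaming change amounts to forwarding chunks to the UI as they arrive rather than awaiting the full completion. Below is a minimal sketch of what such a streaming adapter can look like; the observer shape follows NLUX's custom-adapter pattern as I recall it, and `/api/chat` is a hypothetical Next.js route, so verify both against the current docs:

```typescript
// Sketch of a streaming chat adapter (hedged: exact NLUX adapter types may
// differ; /api/chat is a hypothetical Next.js route). Instead of awaiting the
// full completion (batch), each decoded chunk is pushed to the observer as
// soon as it arrives.

export type StreamObserver = {
  next: (chunk: string) => void;   // called once per streamed chunk
  complete: () => void;            // called when the stream ends
  error: (err: Error) => void;     // called on failure
};

// Reads a byte stream and pushes decoded text chunks to the observer.
export async function pumpStream(
  body: ReadableStream<Uint8Array>,
  observer: StreamObserver,
): Promise<void> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      // stream: true keeps multi-byte characters split across chunks intact
      observer.next(decoder.decode(value, { stream: true }));
    }
    observer.complete();
  } catch (err) {
    observer.error(err as Error);
  }
}

// The adapter itself: POST the prompt to the (hypothetical) chat route and
// stream the response body through pumpStream.
export const adapter = {
  streamText: async (prompt: string, observer: StreamObserver) => {
    const response = await fetch('/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ prompt }),
    });
    if (!response.ok || !response.body) {
      observer.error(new Error('Chat request failed or has no body'));
      return;
    }
    await pumpStream(response.body, observer);
  },
};
```

Splitting the stream-draining loop into `pumpStream` keeps the fetch wiring separate from the chunk-forwarding logic, which also makes the streaming part testable without a server.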

For reviewers:

salmenus commented 1 week ago

Thanks for the PR @FranciscoMoretti

> Is there an example that I can apply this to as well? None of them use Vercel AI.

No. Right now, live examples on the website do not include Next.js (they are React only). You are welcome to add a Next.js example if you want; you may need to update the example runner.

> Kept the onFinish empty function just like in the Vercel AI sample. I don't have a strong opinion about keeping or removing that piece.

Fine for me.

> Is there a way to simplify the adapter?

We can probably offer utils to simplify streaming.

For standardized APIs (such as LangChain LangServe and HuggingFace Inference), developers don't need to write code; they just provide config. We could apply a similar approach for Next.js, but I don't think it should be the only way to integrate with NLUX, as users may want more control and flexibility.
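To illustrate the config-only approach mentioned above: for a standardized endpoint such as LangServe, the adapter can reduce to a single configured hook. The package and hook names below follow NLUX's LangChain adapter as I recall them, and the URL is a placeholder, so verify both against the current documentation:

```typescript
// Hedged sketch of a config-only NLUX adapter for a LangServe endpoint:
// no streaming code to write, just configuration. Verify the package and
// hook names against the current NLUX docs before relying on them.
import { useChatAdapter } from '@nlux/langchain-react';

export function useMyAdapter() {
  // Placeholder URL: point this at your own LangServe runnable.
  return useChatAdapter({ url: 'https://example.com/my-runnable' });
}
```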

βœ… PR Approved. Merging.
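As a footnote on the "utils to simplify streaming" idea discussed above: one possible shape for such a util is a factory that hides the observer plumbing behind an async iterable, so adapter authors only supply a chunk source. Everything here, names included, is a hypothetical illustration, not an existing NLUX API:

```typescript
// Hypothetical helper (not an existing NLUX util): wraps an async-iterable
// chunk source into an adapter-shaped object exposing streamText.
type StreamObserver = {
  next: (chunk: string) => void;
  complete: () => void;
  error: (err: Error) => void;
};

export function createStreamAdapter(
  source: (prompt: string) => AsyncIterable<string>,
) {
  return {
    streamText: async (prompt: string, observer: StreamObserver) => {
      try {
        for await (const chunk of source(prompt)) {
          observer.next(chunk); // forward each chunk as it is produced
        }
        observer.complete();
      } catch (err) {
        observer.error(err as Error);
      }
    },
  };
}
```

With a helper like this, the fetch-and-decode loop from the adapter could live in one reusable async generator, and each integration would only differ in how it produces chunks.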