Closed — EmilioMont21 closed this issue 2 months ago
Hmm, I can reproduce the issue. It seems to be related to trying to bundle server code for the browser. We will look into this further.
In the meantime, you can look at the Vercel AI SDK example in the repo; the issue seems to affect only the server components.
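For reference, a minimal sketch of that streaming setup, assuming the `aiUseChatAdapter` helper exported from `@upstash/rag-chat/nextjs` and a `ragChat` client defined elsewhere in the project (the route path and message handling here are illustrative, not taken from this thread):

```ts
// app/api/chat/route.ts -- hypothetical route using the Vercel AI SDK adapter
import { aiUseChatAdapter } from "@upstash/rag-chat/nextjs";
import { ragChat } from "@/utils/rag-chat";

export const POST = async (req: Request) => {
  const { messages } = await req.json();
  // The useChat hook sends the full message history; take the latest question.
  const question = messages[messages.length - 1].content;

  const response = await ragChat.chat(question, { streaming: true });
  // Wrap the streaming result in the shape the Vercel AI SDK's useChat expects.
  return aiUseChatAdapter(response);
};
```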
Hey @EmilioMont21, the issue was probably coming from llamaindex. Can you bump the rag-chat version and try again?
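(Assuming npm as the package manager, bumping to the latest release is just:)

```sh
npm install @upstash/rag-chat@latest
```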
Great, seems to be working perfectly now, ty
I've been trying to follow the official @upstash/rag-chat documentation and set up an API route in my Next.js project as instructed.
Here’s the code I’m using for the API route, the same as in the documentation:
```ts
import { ragChat } from "@/utils/rag-chat";

export const POST = async (req: Request) => {
  const { message } = await req.json();
  const { output } = await ragChat.chat(message, { streaming: true });
  return new Response(output);
};
```
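For completeness, here is roughly what my `@/utils/rag-chat` module contains; this is a minimal sketch based on the library's documented defaults rather than a verbatim copy of my file:

```ts
// utils/rag-chat.ts -- minimal client setup (sketch)
import { RAGChat } from "@upstash/rag-chat";

// With no options, RAGChat is expected to pick up the Upstash Vector
// (and, if configured, Redis) credentials from environment variables
// such as UPSTASH_VECTOR_REST_URL and UPSTASH_VECTOR_REST_TOKEN.
export const ragChat = new RAGChat();
```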
I haven’t customized next.config.js beyond the default setup.
I’ve tried reinstalling the dependencies and tweaking the Webpack configuration by adding loaders for binary files (like .wasm), but the issue persists. I’m running a clean Next.js project and installed @upstash/rag-chat as per the documentation.
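The Webpack tweak I tried looked roughly like this (a sketch of one common way to route .wasm files through Webpack's asset handling, not an exact copy of my config):

```js
// next.config.js -- rough sketch of the loader tweak I tried
/** @type {import('next').NextConfig} */
const nextConfig = {
  webpack: (config) => {
    // Emit .wasm binaries as static assets instead of letting
    // Webpack try to parse them as JavaScript modules.
    config.module.rules.push({
      test: /\.wasm$/,
      type: "asset/resource",
    });
    return config;
  },
};

module.exports = nextConfig;
```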
Any insights or suggestions would be greatly appreciated!
Error: