lightify97 opened 3 months ago
@lgrammel any pointers?
We're facing this issue too. It happens with all models, even without LangChain. When we reverted the ai package from v3 to v2, streaming worked fine with no lag.
@amadk do you have a minimal reproduction that you could share?
OK, it seems to be working fine with the minimal reproduction, i.e. just pure Next.js (pages router) and the AI SDK, no other library or anything. We're going to try upgrading our UI library and refactoring the components on the chat page to see if that helps. It was just strange because AI SDK v2 worked fine with our current setup. Will investigate a bit more and send updates.
Any update on this? It's happening on our end as well; we can't seem to get it to work!
Hi, sorry for the super late response. The problem started occurring for us when we used react-syntax-highlighter with the AI SDK. I don't think the AI SDK itself is laggy; it's actually super fast. The problem is that it causes the components to re-render so quickly (especially with fast models like Claude Sonnet) that the page begins to freeze and becomes laggy. So we added a throttle to fix it:
```typescript
import { useEffect, useMemo, useState } from 'react'
import throttle from 'lodash/throttle'

export const useThrottleMessages = (messages: any[]) => {
  const [throttledMessages, setThrottledMessages] = useState<any[]>([])

  // Create the throttled setter once; re-creating it on every effect run
  // would reset the timer and defeat the throttling.
  const updateThrottledMessages = useMemo(
    () => throttle((newMessages: any[]) => setThrottledMessages(newMessages), 50),
    []
  )

  useEffect(() => {
    updateThrottledMessages(messages)
  }, [messages?.[messages.length - 1]?.content])

  // Cancel any pending trailing call on unmount.
  useEffect(() => () => updateThrottledMessages.cancel(), [])

  return { throttledMessages }
}
```
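For reference, the core behavior lodash's `throttle` contributes here can be sketched in plain TypeScript. This is a simplified stand-in, not lodash's actual implementation: the wrapped function fires at most once per `waitMs`, immediately on the leading edge, and once more on the trailing edge with the latest argument.

```typescript
// Simplified throttle sketch (stand-in for lodash.throttle).
function throttle<T>(fn: (arg: T) => void, waitMs: number) {
  let last = 0
  let pending: ReturnType<typeof setTimeout> | undefined
  let lastArg!: T

  const invoke = () => {
    last = Date.now()
    fn(lastArg)
  }

  return Object.assign(
    (arg: T) => {
      lastArg = arg
      const remaining = waitMs - (Date.now() - last)
      if (remaining <= 0) {
        // Leading edge: fire immediately.
        if (pending) { clearTimeout(pending); pending = undefined }
        invoke()
      } else if (!pending) {
        // Trailing edge: fire once more with the latest argument.
        pending = setTimeout(() => { pending = undefined; invoke() }, remaining)
      }
    },
    {
      cancel() {
        if (pending) { clearTimeout(pending); pending = undefined }
      },
    }
  )
}
```

With a burst of calls, only the first fires synchronously; the rest are coalesced into one trailing call, which is why the chat UI re-renders at most ~20 times per second instead of once per token.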
@amadk Great insight, thanks!
Description
Hi! I'm experiencing an issue with the stream data response. Mainly there are two issues: the `sources` stream data is duplicated, with the same data repeated multiple times as new entries in the returned array; and the response is laggy. When I changed `chatMode` to `text`, so that the endpoint was not adding any streamData, it worked perfectly fine without any issues.

Versions: `"ai": "^3.1.30"`, `"@ai-sdk/openai": "^0.0.9"`, `"next": "14.1.4"`, `"@langchain/community": "^0.0.53"`, `"@langchain/core": "^0.1.61"`, `"@langchain/openai": "^0.0.28"`
Code example
Here's part of the client side component:
Update
When stream data is removed from the response like below, it still lags. So the problem might be related to the `LangchainAdapter` or to `stream-data` handling in the `useChat` hook.

Additional context

No response
No response
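A related mitigation worth noting here, sketched as an assumption rather than anything confirmed in this thread: instead of throttling the rendered messages, buffer incoming stream chunks and flush them on an interval, so the UI re-renders once per flush instead of once per token. The `ChunkBatcher` name is hypothetical.

```typescript
// Hypothetical chunk batcher: collects streamed tokens and hands them to
// the UI in one call per drain, so state updates happen per flush, not
// per token.
class ChunkBatcher {
  private buffer: string[] = []

  constructor(private flush: (text: string) => void) {}

  // Called for every token/chunk arriving from the stream.
  push(chunk: string) {
    this.buffer.push(chunk)
  }

  // Flush everything buffered so far as a single string; no-op when empty.
  drain() {
    if (this.buffer.length === 0) return
    const text = this.buffer.join('')
    this.buffer = []
    this.flush(text)
  }
}
```

In a React component you would call `drain` from a `setInterval` (e.g. every 50 ms) that lives for the duration of the stream, appending the flushed text to state in one update.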