Open aneequrrehman opened 2 months ago
I'm having the same issue, and in fact the Vercel AI Chatbot also has the same bug (to repro: create two chats, type in one, click the other and come back).
So far the only thing that worked for me was manually refreshing the client router like OP suggested.
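For anyone else landing here, the manual-refresh workaround mentioned above looks roughly like this (a sketch, not a fix; `sendMessage` is a placeholder for your own server action):

```typescript
'use client';

import { useRouter } from 'next/navigation';

// Wraps a message-sending server action so that the Next.js client
// router is refreshed afterwards, repopulating the Router cache with
// the latest server-rendered messages.
export function useSendAndRefresh(
  sendMessage: (text: string) => Promise<void>,
) {
  const router = useRouter();

  return async (text: string) => {
    await sendMessage(text); // hypothetical server action that streams the reply
    router.refresh(); // manually refresh the client router, as described above
  };
}
```

This only helps once the action has resolved; it doesn't cover the case where the response is still streaming when the user navigates away.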
The problem exists in ai@3.1.30, and affects the streamUI method as well.
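For context, the `streamUI` call shape I'm seeing this with is roughly the following (simplified sketch; the model and prompt are placeholders):

```typescript
import { streamUI } from 'ai/rsc';
import { openai } from '@ai-sdk/openai';

export async function reply(prompt: string) {
  // Streams React UI from the model response. The streamed value is
  // exactly what does not survive navigation, because the client
  // Router cache serves the page as it was before the stream.
  const result = await streamUI({
    model: openai('gpt-4o'),
    prompt,
    text: ({ content }) => <p>{content}</p>,
  });

  return result.value;
}
```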
I am seeing the same issue and created a bug for ai-chatbot. I am using ai version 3.0.12.

While debugging, I am seeing an issue with the `useUIState` hook: the provider created by `createAI` is getting the updated values, but `useUIState` is not returning the updated state.
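Concretely, the client side of the pattern I'm debugging looks like this (a sketch; `AI` is the provider exported from my server actions file, and the file path is mine):

```typescript
'use client';

import { useUIState } from 'ai/rsc';
import type { AI } from './actions'; // provider created by createAI (hypothetical path)

export function MessageList() {
  // Expected: this hook tracks the UI state held by the <AI> provider.
  // Observed: the provider receives the updated values, but the hook
  // keeps returning the stale state.
  const [messages] = useUIState<typeof AI>();

  return (
    <ul>
      {messages.map((m) => (
        <li key={m.id}>{m.display}</li>
      ))}
    </ul>
  );
}
```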
Description
First of all, thank you for this library - it's great.
Secondly, this might not be an issue and could just be something I'm missing. So, I'm using the Vercel AI SDK with Next.js and experiencing an issue with client-side caching. When a user sends a message, the AI response is streamed to the browser, but if the user navigates away and comes back to the same page, the AI response (last message) is not shown due to the client-side Router cache.
Implementation
I'm loading the messages via a server action using the `conversationId` from the URL. This data sets the initial AI and UI states via the `initialAIState` and `initialUIState` props on the provider returned from the `createAI` function.

I do trigger `revalidatePath` to invalidate the client cache when the user sends a message, but the issue arises after the AI responds to a user message, since the response is streamed to the browser. I tried moving it to the client side, but `createAI` seems to only work server-side.

Update: I could return an `InvalidateCache` component, but that seems like a hack. Plus, if the user navigates away without letting the stream finish, the client caching problem persists.

So, I was wondering: is there a plan to implement something for these scenarios? Or am I doing something wrong here?
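For concreteness, the `InvalidateCache` hack I mean is roughly this (the component name is mine, not something from the SDK):

```typescript
'use client';

import { useEffect } from 'react';
import { useRouter } from 'next/navigation';

// Appended to the streamed UI so it mounts only after the AI response
// has finished streaming, then refreshes the client Router cache.
// If the user navigates away mid-stream it never mounts, which is why
// the caching problem can still persist.
export function InvalidateCache() {
  const router = useRouter();

  useEffect(() => {
    router.refresh();
  }, [router]);

  return null;
}
```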