
Router cache issue with NextJS and Vercel AI SDK #1465

Open aneequrrehman opened 2 months ago

aneequrrehman commented 2 months ago

Description

First of all, thank you for this library - it's great.

Secondly, this might not be an issue and could just be something I'm missing. I'm using the Vercel AI SDK with Next.js and running into a client-side caching problem: when a user sends a message, the AI response is streamed to the browser, but if the user navigates away and then returns to the same page, the last AI message is missing because the client-side Router Cache still serves the stale page.

Implementation

I'm loading the messages via a server action, using the conversationId from the URL. This data sets the initial AI and UI state via the initialAIState and initialUIState props on the provider returned from the createAI function.


export default async function ChatPage({ params: { id } }) {
    const aiConversation = await getAiConversation(id)

    if (aiConversation === null) {
        return <div>Chat not found...</div>
    }

    return (
        <ChatAiProvider
            initialAIState={{
                aiConversationId: aiConversation.id,
                messages: aiConversation.messages.map((m) => ({
                    role: m.from === 'ai' ? 'assistant' : 'user',
                    content: m.content,
                })),
            }}
            initialUIState={aiConversation.messages.map((m) => ({
                id: m.id,
                display:
                    m.from === 'ai' ? (
                        <AiMessage>
                            <Markdown>{m.content}</Markdown>
                        </AiMessage>
                    ) : (
                        <UserMessage>{m.content}</UserMessage>
                    ),
            }))}
        >
            <Chat />
        </ChatAiProvider>
    )
}

I do trigger revalidatePath to invalidate the client cache when the user sends a message, but the issue arises after the AI responds to that message - the response is streamed to the browser, so nothing revalidates the Router Cache once the stream completes.
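For context, the call lives in the server action that persists the user's message, roughly like this (a sketch, not my exact code - persistUserMessage and the /chat/[id] route stand in for my app's):

'use server'

import { revalidatePath } from 'next/cache'

// Stand-in for my actual DB write.
async function persistUserMessage(conversationId: string, content: string) {
    /* db insert elided */
}

export async function saveUserMessage(conversationId: string, content: string) {
    await persistUserMessage(conversationId, content)

    // Bust the Router Cache entry for this chat page. This runs when the
    // user sends a message; the assistant reply streamed back afterwards
    // never triggers another revalidation, which is where the stale
    // cache comes from.
    revalidatePath(`/chat/${conversationId}`)
}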

I tried moving it to the client side, but createAI seems to work only on the server.
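For reference, the provider is defined on the server roughly like this (a sketch following the shape of the ai/rsc examples; the action body is elided):

// app/action.tsx (sketch - shape follows the ai/rsc examples)
import { createAI } from 'ai/rsc'

async function submitUserMessage(content: string) {
    'use server'
    // ...streams the AI response; see the render() call in the Update below
}

export const ChatAiProvider = createAI({
    actions: { submitUserMessage },
    initialAIState: { aiConversationId: '', messages: [] },
    initialUIState: [],
})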

Update:

I could return an InvalidateCache component, but that seems like a hack. And what if the user navigates away before the stream finishes? The client caching problem persists.

...
const ui = render({
    model: 'gpt-3.5-turbo',
    provider: openai,
    messages: aiState.get().messages,
    text: ({ content, done }) => {
        if (done) {
            aiState.done({
                ...aiState.get(),
                messages: [
                    ...aiState.get().messages,
                    {
                        role: 'assistant',
                        content,
                    },
                ],
            })
        }

        return (
            <>
                <AiMessage>
                    <Markdown>{content}</Markdown>
                </AiMessage>
                {/* This component can invalidate the client cache using router.refresh() */}
                <InvalidateCache />
            </>
        )
    },
})

return {
    id: Date.now().toString(),
    display: ui,
}
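InvalidateCache itself would just be a tiny client component that refreshes the router once it mounts, i.e. once it has been streamed into the page (sketch):

'use client'

import { useEffect } from 'react'
import { useRouter } from 'next/navigation'

// Sketch of the hack: once this component is streamed in and mounts,
// it refreshes the router so the Router Cache entry for the current
// page is re-fetched from the server.
export function InvalidateCache() {
    const router = useRouter()

    useEffect(() => {
        router.refresh()
    }, [router])

    return null
}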

So, I was wondering: is there a plan to implement something for these scenarios, or am I doing something wrong here?

jennnx commented 4 weeks ago

I'm having the same issue, and in fact the Vercel AI Chatbot also has the same bug (to repro: create two chats, type in one, click the other and come back).

So far the only thing that has worked for me is manually refreshing the client router, as OP suggested. The problem exists in ai@3.1.30 and affects the streamUI method as well.
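Concretely, I mount a small client component on the chat page that refreshes the router whenever the conversation id changes (a sketch of my workaround, not anything official):

'use client'

import { useEffect } from 'react'
import { useParams, useRouter } from 'next/navigation'

// Workaround sketch: re-fetch the RSC payload every time the user
// navigates (back) into a conversation, bypassing the stale Router Cache.
export function RefreshOnNavigate() {
    const router = useRouter()
    const { id } = useParams()

    useEffect(() => {
        router.refresh()
    }, [id, router])

    return null
}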

pravinjakfs commented 3 weeks ago

I am seeing the same issue and have filed a bug against ai-chatbot. I am using ai version 3.0.12. While debugging, I noticed a problem with the useUIState hook: the provider created by createAI receives the updated values, but useUIState does not return the updated state.
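For what it's worth, this is roughly how the client reads the state (a sketch; the provider name and import path stand in for my app's):

'use client'

import { useUIState } from 'ai/rsc'
// ChatAiProvider is the provider returned by createAI.
import type { ChatAiProvider } from '@/app/action'

export function Messages() {
    // After navigating back to a cached page, this still returns the stale
    // state, even though the provider itself received the updated values.
    const [messages] = useUIState<typeof ChatAiProvider>()

    return (
        <>
            {messages.map((m) => (
                <div key={m.id}>{m.display}</div>
            ))}
        </>
    )
}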