vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

UI Update Failure in Production with AI SDK 3.2.14 on Vercel Edge Runtime #2131

Open steveoOn opened 1 week ago

steveoOn commented 1 week ago

Description

After upgrading the AI SDK to version 3.2.14, an issue occurs when using the streamUI function with a tool's generate function. Specifically, when creating a streamable UI with createStreamableUI and using its update method to refresh the UI inside the generator, the page does not update as expected, even though the data is successfully fetched on the server side.

Importantly, this issue does not manifest in the development environment (running with pnpm dev on localhost:3000). However, when deployed to Vercel (Edge Runtime) in the production environment, the problem becomes apparent. The server-side logic executes correctly, and responses are returned normally, but the UI fails to update. There are no error logs visible.

Downgrading to AI SDK version 3.1.32 resolves the issue in the production environment.
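For anyone applying the same workaround, the pin is straightforward (version taken from this report; pnpm assumed since the repro uses `pnpm dev`):

```shell
# Pin the AI SDK to the last version where streamable UI updates worked in production
pnpm add ai@3.1.32
```

Note that an exact pin (no `^` range) in package.json is needed to keep lockfile refreshes from pulling 3.2.x back in.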

Additionally, there's a minor compatibility issue between AI SDK version 3.1.32 and the latest @ai-sdk/anthropic version 0.0.27. When creating an Anthropic instance and passing the model parameter in streamText, a type incompatibility error is thrown. Upgrading to AI SDK 3.2.14 resolves the type error, but introduces the aforementioned UI update problem.

Code example

const res = await streamUI({
  model: openai('gpt-4o'),
  messages: [
    ...messagesMatch.map((msg: CoreMessage) => ({
      role: msg.role,
      content: msg.content,
    })),
  ],
  system: `\
    you are a friendly assistant!
    If the user asks for the weather, call the tool showWeathers.
    If the user asks for information that needs to be searched on the internet, call the tool searchInternet.`,
  maxTokens: 1000,
  temperature: 0.8,
  maxRetries: 3,
  text: ({ content, done, delta }) => {...},
  tools: {
    searchInternet: {
      description: 'Search the internet',
      parameters: z.object({
        query: z.string().describe('The query to search for'),
      }),
      generate: async function* ({ query }) {
        let searchingTextStream:
          | undefined
          | ReturnType<typeof createStreamableValue<string>>;
        let searchingTextNode: undefined | React.ReactNode;

        const searching = createStreamableUI(
          <InteractionLoading loadingContent="1/5-Searching Internet" />
        );
        const relevantResults = createStreamableUI(<RelevantCardList />);
        yield searching.value;

        const serperSearchResult = await serperGoogleSearch({ ... });

        const sourcesNumber = serperSearchResult.organic?.length || 0;

        searching.update(
          <InteractionLoading
            loadingContent={`2/5-Deep Analyzing ${sourcesNumber} sources`}
          />
        );

        // Updating to ai@3.2.14 causes this step to get stuck in the Vercel production environment.

        let serperSearchResponse;
        let relevantSourcesNumber = 0;
        const relevantLinkNumber = 5;

        if (searchMethod === 'speed') {
          serperSearchResponse = await speedResponseFromSearchResults(
            serperSearchResult,
            relevantLinkNumber
          );

          relevantSourcesNumber =
            serperSearchResponse.mostRelevantLinks.length;
          searching.update(
            <InteractionLoading
              loadingContent={`3/5-Found ${relevantSourcesNumber} most relevant sources`}
            />
          );
          relevantResults.done(
            <RelevantCardList searchResponse={serperSearchResponse} />
          );

          yield (
            <>
              {searching.value}
              {relevantResults.value}
            </>
          );
        } else {
          serperSearchResponse = await generateInitialResponse(
            serperSearchResult,
            relevantLinkNumber
          );

          relevantSourcesNumber =
            serperSearchResponse.mostRelevantLinks.length;
          searching.update(
            <InteractionLoading
              loadingContent={`3/5-Found ${relevantSourcesNumber} most relevant sources`}
            />
          );
          relevantResults.done(
            <RelevantCardList searchResponse={serperSearchResponse} />
          );

          yield (
            <>
              {searching.value}
              {relevantResults.value}
            </>
          );

          if (serperSearchResponse.scrapeWebData.length === 0) {
            searching.update(
              <InteractionLoading
                loadingContent={`4/5-Reading relevant sources`}
              />
            );
            serperSearchResponse = await aiScraperFromSearchResults(
              serperSearchResponse
            );
          }

          searching.update(
            <InteractionLoading loadingContent={`5/5-Getting answer`} />
          );
        }

        const searchBasedPrompt = generateAnswerFromSearchResponse(
          serperSearchResponse,
          query,
          searchMethod
        );

        const historyMessages = [
          ...messagesMatch.map((msg: CoreMessage) => ({
            role: msg.role,
            content: msg.content,
          })),
        ];

        const model = searchMethod === 'speed' ? 'gpt-3.5-turbo' : 'gpt-4o';
        const { textStream } = await streamText({
          model: openai(model),
          system: searchBasedPrompt,
          messages: historyMessages,
          maxTokens: 2000,
          temperature: 0.8,
        });

        if (!searchingTextStream) {
          searchingTextStream = createStreamableValue('');
          searchingTextNode = (
            <BotMessage content={searchingTextStream.value} />
          );
        }

        searching.done(<InteractionLoading loadingContent="Done" done />);

        yield (
          <>
            {searching.value}
            {relevantResults.value}
            {searchingTextNode}
          </>
        );

        for await (const text of textStream) {
          searchingTextStream.update(text);
        }
        searchingTextStream.done();

       ...

        return (
          <div>
            {relevantResults.value}
            {searchingTextNode}
            <SourcesList serperSearchResult={serperSearchResult} />
          </div>
        );
      },
    },
...

Additional context

The ideal solution would be to use the latest versions of all dependencies without encountering these issues.

lgrammel commented 1 week ago

Thanks for the bug report! Which Next.js version are you using?

steveoOn commented 1 week ago

> Thanks for the bug report! Which Next.js version are you using?

I'm more than happy to offer further information if needed~

lgrammel commented 1 week ago

My suspicion is that the issue was caused by https://github.com/vercel/ai/pull/1825#issuecomment-2185983369 (which was required unfortunately, and a correct fix is currently blocked by nextjs/react capabilities).

The library downgrade issue is a side effect of introducing request headers, which required a change to the provider interfaces.

So unfortunately there is no solution, because reverting would mean re-introducing the other bugs.

steveoOn commented 1 week ago

> My suspicion is that the issue was caused by #1825 (comment) (which was required unfortunately, and a correct fix is currently blocked by nextjs/react capabilities).
>
> The library downgrade issue is a side effect of introducing request headers, which required a change to the provider interfaces.
>
> So unfortunately there is no solution, because reverting would mean re-introducing the other bugs.

Does this mean that in the current latest version, we cannot use the createStreamableUI method to update UI within StreamUI? If so, is there any recommended workaround or alternative approach for updating the UI in real-time when using the latest version of the SDK in a production environment on Vercel Edge Runtime?
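One pattern worth trying — a hedged sketch, not an official fix — is to lean entirely on streamUI's documented async-generator contract instead of createStreamableUI.update: have the tool's generate generator yield a fresh React node for each stage, so every UI update travels through the generator stream itself rather than a separate streamable. The component and helper names below are placeholders taken from the original report; whether this avoids the Edge Runtime issue would need to be verified.

```tsx
// Sketch: drive all intermediate UI through `yield` instead of streamable.update().
// <InteractionLoading />, serperGoogleSearch, and <FinalAnswer /> are placeholders
// from (or modeled on) the original report.
generate: async function* ({ query }) {
  yield <InteractionLoading loadingContent="1/5-Searching Internet" />;

  const searchResult = await serperGoogleSearch({ query });

  // Instead of searching.update(...), yield a replacement node;
  // streamUI swaps the previously yielded UI for this one.
  yield (
    <InteractionLoading
      loadingContent={`2/5-Deep Analyzing ${searchResult.organic?.length ?? 0} sources`}
    />
  );

  // ...remaining stages, each as its own `yield`...

  return <FinalAnswer />;
}
```

The trade-off is that each yield replaces the whole tool UI, so composite layouts (spinner plus result cards) have to be re-yielded as a single fragment rather than updated piecemeal through independent streamables.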

Vochsel commented 6 days ago

Not sure if it's the same cause, but I'm experiencing the same symptoms.

However, probably of note: I'm deploying via firebase-tools onto a Google Cloud Function.

I cannot use createStreamableUI or async generator functions to stream a tool response in a streamUI call after deploying. Things work perfectly fine with next dev. No error is reported in the server logs.

This is still shown as valid in the docs https://sdk.vercel.ai/docs/ai-sdk-rsc/streaming-react-components

Also, downgrading ai package to 3.1.32 did not help for me.

froggy1014 commented 6 days ago

I think I have the same issue: next dev works fine in my local environment, but the generator function and the initial Loading component from streamUI are not working on Vercel production. The same branch deployed on Netlify works, as shown below.

[screen recording comparing the two deployments]

left: vercel / right : netlify

"@ai-sdk/openai": "^0.0.33",
"ai": "^3.2.7",

Downgrading did not work for me, though.