vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

add tool roundtrips to streamUI #2848

Closed AWKohler closed 3 weeks ago

AWKohler commented 3 weeks ago

Feature Description

Add tool roundtrips to streamUI. This would allow us to use agents alongside Generative UI.
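
For context, a rough sketch of what this could look like. Everything below is the existing streamUI API (with openai('gpt-4o') and the weather tool as placeholder choices); the commented-out maxToolRoundtrips option is hypothetical and is what this issue is asking for, with the name borrowed from generateText:

```tsx
import { openai } from '@ai-sdk/openai';
import { streamUI } from 'ai/rsc';
import { z } from 'zod';

const result = await streamUI({
  model: openai('gpt-4o'),
  prompt: 'What is the weather like in San Francisco?',
  text: ({ content }) => <p>{content}</p>,
  // Hypothetical option requested by this issue; streamUI currently stops
  // after a single tool invocation.
  // maxToolRoundtrips: 5,
  tools: {
    getWeather: {
      description: 'Get the weather for a city',
      parameters: z.object({ city: z.string() }),
      generate: async function* ({ city }) {
        yield <p>Checking the weather in {city}…</p>;
        const weather = { tempF: 68, conditions: 'sunny' }; // stand-in for a real lookup
        return (
          <p>
            {city}: {weather.tempF}°F and {weather.conditions}
          </p>
        );
      },
    },
  },
});

// result.value is the streamed React node to render.
```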

Use Case

No response

Additional context

No response

Godrules500 commented 3 weeks ago

@AWKohler It is currently being added to streamText (#2836). Also, I just switched from streamUI to streamText; it may be worth trying that. It's not too hard of a transition.
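
For reference, a minimal sketch of that streamText approach once #2836 lands, assuming the option keeps the maxToolRoundtrips name already used by generateText; the model and tool here are placeholders:

```ts
import { openai } from '@ai-sdk/openai';
import { streamText, tool } from 'ai';
import { z } from 'zod';

const result = await streamText({
  model: openai('gpt-4o'),
  // Feeds tool results back to the model so it can keep generating.
  maxToolRoundtrips: 5,
  tools: {
    getWeather: tool({
      description: 'Get the weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, tempF: 68, conditions: 'sunny' }),
    }),
  },
  prompt: 'What is the weather like in San Francisco?',
});

for await (const delta of result.textStream) {
  process.stdout.write(delta);
}
```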

AWKohler commented 3 weeks ago

That doesn't appear to have generative UI support. "initial" is not supported

Godrules500 commented 3 weeks ago

> That doesn't appear to have generative UI support. "initial" is not supported

I'll post what I do later, but I make the streamUI (or streamText now) call asynchronous and use reply = createStreamableUI with my initial UI value.

I then use reply.update and reply.done to update the UI.

I actually found that to update the UI quicker than the initial UI piece from streamUI.
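
In other words, something along these lines (a minimal sketch of that pattern inside a hypothetical server action; the fuller version is in the next comment):

```tsx
import { createStreamableUI } from 'ai/rsc';

export async function submitMessage() {
  // Create the UI stream with its initial value up front, start generation
  // without awaiting it, and push updates as text arrives.
  const reply = createStreamableUI(<p>Thinking…</p>);

  (async () => {
    reply.update(<p>Partial answer…</p>);
    reply.done(<p>Final answer.</p>);
  })();

  return { display: reply.value };
}
```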

AWKohler commented 3 weeks ago

> I'll post what I do later

That would be great!

Godrules500 commented 3 weeks ago

Let me know if this helps!

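      // Assumes streamText from 'ai', plus createStreamableUI and createStreamableValue
      // from 'ai/rsc'. BotMessage, setupVertex, handleUnexpectedStreamingErrors,
      // titlePromise, index, etc. are app-specific and defined elsewhere.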
     // Used for just updating the text instead of rendering the full html
      let textStream: undefined | ReturnType<typeof createStreamableValue<string>> = createStreamableValue('')
      const reply = createStreamableUI(<BotMessage content={textStream.value} isFinal={false} index={index} />)
      streamText({
        temperature: 1,
        // topP: 0.95,
        model: setupVertex(useGrounding),
        system: system_prompt,
        messages: secureMessages,

        onFinish: async finishResult => {
          let text = finishResult.text
          textStream!.done()

          // boring unrelated stuff was here.

          // Closes the UI stream.
          reply.done(
            <BotMessage isFinal={true} index={index}>
              {text}
            </BotMessage>
          )
        }
      })
        .then(async x => {
          let text = ''
          for await (const streamedText of x.textStream) {
            text += streamedText || ''
            // The textStream was set up with content={textStream.value} of createStreamableUI
            textStream?.update(streamedText)
          }
          return text
        })
        .catch(async error => {
          let title = await titlePromise
          handleUnexpectedStreamingErrors(error, aiState, reply, true, title)
        })

      return {
        id,
        display: reply.value
      }
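
On the client side, the BotMessage component is app-specific, but here is a guess at how it might read the streamable value. Note that the snippet above calls textStream.update() with each delta rather than the running total, so the client accumulates the chunks itself:

```tsx
'use client';

import { readStreamableValue, type StreamableValue } from 'ai/rsc';
import { useEffect, useState } from 'react';

// Hypothetical reconstruction of a BotMessage-style component (other props omitted).
export function BotMessage({ content }: { content: StreamableValue<string> }) {
  const [text, setText] = useState('');

  useEffect(() => {
    let cancelled = false;

    (async () => {
      // Accumulate each delta pushed from the server via textStream.update().
      for await (const delta of readStreamableValue(content)) {
        if (cancelled) break;
        if (delta) setText(prev => prev + delta);
      }
    })();

    return () => {
      cancelled = true;
    };
  }, [content]);

  return <div>{text || 'Thinking…'}</div>;
}
```
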
lgrammel commented 3 weeks ago

Duplicates #1895

AWKohler commented 1 week ago

> Let me know if this helps!

@Godrules500 This is great and works well! There is one issue I am having, however. I use a spinner component as my initial message component (the placeholder before a response is given). If I do this with your approach (createStreamableUI(<Spinner />)), then for whatever reason token streaming no longer works, and it responds as if streaming was disabled. If I use an empty message as the initial placeholder, then everything works. Did you encounter this, and if so, what did you do?
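
For reference, a minimal sketch of the two setups being compared, with Spinner as a stand-in for the commenter's own component; this only illustrates the report, not a fix:

```tsx
import { createStreamableUI } from 'ai/rsc';

// Stand-in for the commenter's spinner component.
function Spinner() {
  return <div className="spinner">Loading…</div>;
}

// Variant A (reported to stop token streaming): a spinner as the initial UI.
export async function replyWithSpinner() {
  const reply = createStreamableUI(<Spinner />);
  // ... streamText / reply.update / reply.done as in the snippet above ...
  return { display: reply.value };
}

// Variant B (reported to work): an effectively empty message as the initial UI.
export async function replyWithEmptyPlaceholder() {
  const reply = createStreamableUI(<div />);
  // ... same streaming logic ...
  return { display: reply.value };
}
```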