vercel / ai

Build AI-powered applications with React, Svelte, Vue, and Solid
https://sdk.vercel.ai/docs

Adding messages to useChat without streaming #415

Open codasana opened 1 year ago

codasana commented 1 year ago

In my chatbot, I'm using the useChat hook to display streaming messages.

While streaming is what I want most of the time, I have a use case where I need to manipulate the OpenAI response message before it reaches the frontend. In that scenario, I don't want streaming; I want to add the message directly.

It's a Next.js app. Here's my setup:

Frontend:

const { messages, input, handleInputChange, handleSubmit } = useChat()

{A map on messages}

API (/api/chat)

export async function POST(req: Request) {
  const { messages } = await req.json()

  // Ask OpenAI for a streaming chat completion given the prompt
  const response = await openai.createChatCompletion({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages
  })
  // Convert the response into a friendly text-stream
  const stream = OpenAIStream(response)
  // Respond with the stream
  return new StreamingTextResponse(stream)
}

I'm returning the stream with StreamingTextResponse(stream).

In a certain use case (for example, when the request body contains a flag), how can I add a message to messages in useChat() directly, without streaming it?

tayzlor commented 1 year ago

Unsure if this fully fits your use case, but you could use the setMessages() function from the Chat Helpers to update the local message state without triggering an API call to the server.

Note, however, that depending on how your API endpoint is implemented, the locally added message may still be sent to the server on the next request.

Otherwise, you could handle your specific need when rendering the local messages array to the screen: check for your flag and add an extra message or piece of content in the UI.
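A minimal sketch of the setMessages() approach. The message shape and the id generation here are my own assumptions, not taken from the SDK; in the component you would pass the helper's result to setMessages():

```typescript
// Assumed message shape, compatible with what useChat renders
type Message = { id: string; role: 'user' | 'assistant'; content: string }

// Pure helper: append an assistant message without any server round-trip.
// In the component: setMessages(appendAssistant(messages, 'Edited reply'))
function appendAssistant(messages: Message[], content: string): Message[] {
  return [
    ...messages,
    { id: Math.random().toString(36).slice(2), role: 'assistant', content },
  ]
}
```

Because the helper is pure, the UI update stays entirely client-side; whether the appended message later reaches the server depends on what you send in the next request body.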

RobertHH-IS commented 1 year ago

If you ever need to intercept the stream or send a direct message, you need to create a new stream to return. For example...

const stream = OpenAIStream(res, {
  experimental_onFunctionCall: async (
    { name, arguments: args },
    createFunctionCallMessages
  ) => {
    // handle function calls here
  },
  async onCompletion(completion) {
    // runs once the full completion is available
  },
})

const reader = stream.getReader()

// Re-emit the original stream, manipulating chunks as needed
async function* generateStreamData() {
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    // manipulate the chunk here before yielding it
    yield value
  }
}

const newStream = new ReadableStream({
  start(controller) {
    ;(async () => {
      for await (const chunk of generateStreamData()) {
        controller.enqueue(chunk)
      }
      controller.close()
    })()
  },
})

return new StreamingTextResponse(newStream)
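The same wrapping idea can be written as a small framework-free helper over the Web Streams API; the helper name and the uppercase transform below are my own, not from the SDK:

```typescript
// Wrap an existing ReadableStream and transform each chunk as it passes through
function mapStream(
  source: ReadableStream<Uint8Array>,
  transform: (chunk: Uint8Array) => Uint8Array,
): ReadableStream<Uint8Array> {
  const reader = source.getReader()
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { done, value } = await reader.read()
      if (done) {
        controller.close()
        return
      }
      controller.enqueue(transform(value!))
    },
  })
}

// Collect a stream back into a string (for demonstration)
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader()
  const decoder = new TextDecoder()
  let out = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    out += decoder.decode(value)
  }
  return out
}

// Demo: a two-chunk source, uppercased chunk by chunk
const encoder = new TextEncoder()
const source = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(encoder.encode('hello '))
    controller.enqueue(encoder.encode('world'))
    controller.close()
  },
})
const upper = mapStream(source, (chunk) =>
  encoder.encode(new TextDecoder().decode(chunk).toUpperCase()),
)
```

Using pull() instead of eagerly draining the source means chunks are only read when the consumer asks for them, which preserves backpressure.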

Even though streaming looks nice, development is definitely more difficult: parsing JSON from a completed response is much easier than parsing and manipulating a stream. "Fake streaming" after completion may be much easier, as done in the following code from a langchain example:

    const executor = await initializeAgentExecutorWithOptions(tools, chat, {
      agentType: "openai-functions",
      verbose: true,
      returnIntermediateSteps,
      memory: new BufferMemory({
        memoryKey: "chat_history",
        chatHistory: new ChatMessageHistory(previousMessages),
        returnMessages: true,
        outputKey: "output",
      }),
      agentArgs: {
        prefix: TEMPLATE,
      },
    });

    const result = await executor.call({
      input: currentMessageContent,
    });

    // Intermediate steps are too complex to stream
    if (returnIntermediateSteps) {
      return NextResponse.json(
        { output: result.output, intermediate_steps: result.intermediateSteps },
        { status: 200 },
      );
    } else {
      /*
       * Agent executors don't support streaming responses (yet!), so stream back the
       * complete response one character at a time with a delay to simulate it.
       */
      const textEncoder = new TextEncoder();
      const fakeStream = new ReadableStream({
        async start(controller) {
          for (const character of result.output) {
            controller.enqueue(textEncoder.encode(character));
            await new Promise((resolve) => setTimeout(resolve, 20));
          }
          controller.close();
        },
      });

      return new StreamingTextResponse(fakeStream);
    }
  } catch (e: any) {
    return NextResponse.json({ error: e.message }, { status: 500 });
  }
}
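The fake-stream idea above can be distilled into a small standalone helper (the names fakeStream and collect are my own, not from langchain or the SDK):

```typescript
// Emit a completed string one character at a time, optionally with a delay,
// so the frontend's streaming UI still works on a non-streamed result
function fakeStream(text: string, delayMs = 0): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      for (const ch of text) {
        controller.enqueue(encoder.encode(ch))
        if (delayMs > 0) await new Promise((r) => setTimeout(r, delayMs))
      }
      controller.close()
    },
  })
}

// Collect a stream back into a string (for demonstration)
async function collect(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader()
  const decoder = new TextDecoder()
  let out = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    out += decoder.decode(value)
  }
  return out
}
```

In the route you would run the completion without streaming, manipulate the final text, and then return new StreamingTextResponse(fakeStream(text, 20)) so the useChat frontend needs no changes.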