FredrikOseberg / react-chatbot-kit

MIT License

How to run it for OpenAI response streaming #177

Open p2991459 opened 6 months ago

OliverThomas2000 commented 5 months ago

This is how my setup works. I use a streaming response from FastAPI.

I have this function in my action provider for updating an existing message:

    // Replace the last message in the chatbot state with a copy containing the latest streamed text
    const updateLastMessage = (message) => {
      setState((prev) => ({
        ...prev,
        messages: [...prev.messages.slice(0, -1), { ...prev.messages.at(-1), message }],
      }));
    };
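
addMessageToState is used in the action further down but isn't shown in this thread, so here is a rough sketch of how both helpers might sit inside a functional ActionProvider, following the pattern from the library's docs - the exact shape of your provider may differ:

    import React from "react";

    const ActionProvider = ({ createChatBotMessage, setState, children }) => {
      // Append a brand-new message to the chatbot state
      const addMessageToState = (message) => {
        setState((prev) => ({ ...prev, messages: [...prev.messages, message] }));
      };

      // updateLastMessage as defined above
      const updateLastMessage = (message) => {
        setState((prev) => ({
          ...prev,
          messages: [...prev.messages.slice(0, -1), { ...prev.messages.at(-1), message }],
        }));
      };

      return (
        <div>
          {React.Children.map(children, (child) =>
            React.cloneElement(child, {
              actions: { addMessageToState, updateLastMessage }, // expose your streaming action here too
            })
          )}
        </div>
      );
    };

    export default ActionProvider;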

I then use this inside the action:

    // `reader` is a ReadableStream reader obtained from the fetch response body
    // (see the full sketch below). Decoding with { stream: true } keeps multi-byte
    // characters intact when they are split across chunks.
    let done, value;
    let messageBuffer = "";
    const decoder = new TextDecoder("utf-8");
    addMessageToState(createChatBotMessage("streaming...")); // you need a dummy message to update
    while (!done) {
      ({ done, value } = await reader.read());
      messageBuffer += decoder.decode(value, { stream: true });
      updateLastMessage(messageBuffer);
    }
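
For reference, here is a minimal sketch of the whole action put together. The handleUserMessage name, the endpoint URL, and the request body shape are placeholders for illustration - the reader simply comes from the fetch response body of your FastAPI streaming endpoint:

    // Minimal end-to-end sketch; swap in your own FastAPI streaming endpoint.
    const handleUserMessage = async (userText) => {
      const response = await fetch("http://localhost:8000/chat/stream", { // placeholder URL
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ message: userText }), // placeholder request shape
      });

      const reader = response.body.getReader(); // the `reader` used in the loop above
      const decoder = new TextDecoder("utf-8");
      let messageBuffer = "";
      let done, value;

      addMessageToState(createChatBotMessage("streaming...")); // dummy message to update

      while (!done) {
        ({ done, value } = await reader.read());
        messageBuffer += decoder.decode(value, { stream: true });
        updateLastMessage(messageBuffer);
      }
    };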

Bear in mind that you can also adjust the delay on createChatBotMessage - its default is 750ms. If you don't want any loading animation at all, you can set this to a negative value (delay: -750). As far as I know this is safe to do.
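
For example, the placeholder message from the loop above could be created with the delay passed in the options argument:

    // Cancel out the default 750ms delay so the placeholder appears without a loading animation
    addMessageToState(createChatBotMessage("streaming...", { delay: -750 }));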