OvidijusParsiunas / deep-chat

Fully customizable AI chatbot component for your website
https://deepchat.dev
MIT License

Handling multiple message responses? #83

Closed ckcollab closed 9 months ago

ckcollab commented 9 months ago

Hey there, thanks so much for releasing this -- much appreciated!

I'm having a bit of trouble rendering multiple message responses I'm receiving from my backend.

Here's my Vue template code:

<deep-chat
  :demo="true"
  :initialMessages="thread.messages"
  :request="{
    handler: sendMessage
  }"
/>

And here's my sendMessage helper function, which attempts to fire off multiple signals:

const sendMessage = async (body, signals) => {
  try {
    const response = await useRequest(`/threads/${thread.value.id}/message/`, {
      method: 'POST',
      body: {
        message: body.messages[0].text,
      }
    })

    response.forEach((message) => {
      signals.onResponse(message)
    })
  } catch (e) {
    useErrorHandler(e)
  }
}

This only ever prints the first message, so I assume signals.onResponse is only meant to be called once per request? It's not apparent to me how to make this call multiple times. I'm probably missing something very simple; appreciate any help, thanks!

OvidijusParsiunas commented 9 months ago

Hi @ckcollab.

You are on the right track. By default, the handler function is intended to handle one text response per request. If you need more dynamic behaviour where you return multiple response text messages, I would recommend using the websocket handler instead. All you'll need to do is set the websocket property to true and then call signals.onOpen(); at the start of the handler, as it is triggered when the component loads up. There is a good example in the Websocket tab of the handler function documentation. You can change it to something like this:

// Vue component
:request="{
  websocket: true,
  handler: sendMessage
}"

// sendMessage variable
const sendMessage = async (body, signals) => {
  try {
    signals.onOpen(); // enables the user to send messages
    const response = await useRequest(`/threads/${thread.value.id}/message/`, {
      method: 'POST',
      body: {
        message: body.messages[0].text,
      }
    })
    response.forEach((message) => {
      signals.onResponse(message); // displays a text message from the server
    })
  } catch (e) {
    useErrorHandler(e)
  }
}

Let me know if this helps.

ckcollab commented 9 months ago

Thanks for the blazing fast response!

Here's my latest attempt, which mostly works, but the DeepChat component no longer enters a "waiting for response" state when I send a message. I'm sure there must be a simple call that enables the loading bubble?

// Component..
<deep-chat
  :demo="true"
  :initialMessages="thread.messages"
  :request="{
    websocket: true,
    handler: chatEventHandler
  }"
/>

// Handler..
const chatEventHandler = async (_, signals) => {
  signals.onOpen(); // enables the user to send messages, allows us to handle multiple msgs "websocket style"

  signals.newUserMessage.listener = async (body) => {
    try {
      const response = await useRequest(`/threads/${thread.value.id}/message/`, {
        method: 'POST',
        body: {
          message: body.messages[0].text,
        }
      })

      response.forEach((message) => {
        signals.onResponse(message)
      })
    } catch (e) {
      useErrorHandler(e)
    }
  };
}

OvidijusParsiunas commented 9 months ago

That's a good point. When I originally designed the interface for websockets I did not consider the need for a loading bubble as the intent was for messages to be fully asynchronous (meaning that the user could send multiple messages without the need to wait/load for a response from the server). Having looked at the code, there are two core ways that you can go about this:

  1. You can create your own custom loading message:

You can simply send a signals.onResponse({text: 'Loading...'}); message while you are waiting for a response from the server, and then, on the first signal containing the server's response, add the overwrite property to replace the loading message, e.g.

signals.onResponse({text: responseText, overwrite: true});

You can also use the disableSubmitButton method to prevent the user from being able to send messages.
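The loading-then-overwrite flow can be sketched with a mocked signals object. The mock below only imitates the component's append/overwrite behaviour so the sketch runs standalone; in a real handler, signals is supplied by deep-chat:

```javascript
// Mock of deep-chat's signals, imitating append/overwrite semantics.
const messages = [];
const signals = {
  onResponse: (msg) => {
    // `overwrite: true` replaces the previous response instead of appending
    if (msg.overwrite && messages.length > 0) {
      messages[messages.length - 1] = msg;
    } else {
      messages.push(msg);
    }
  },
};

// 1. Show a placeholder while waiting for the server
signals.onResponse({ text: 'Loading...' });

// 2. The first real response overwrites the placeholder
signals.onResponse({ text: 'Hello from the server', overwrite: true });
```

After both calls, only the real server message remains in the list, which is exactly the effect you want in the chat window.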

If you want the same loading animation bubble as the native deep chat loading message bubble, it is a little bit more tricky. You will first have to use the html property in your response messages instead of text, and the loading message will have to be:

signals.onResponse({html: '<div class="loading-message-text custom-loading"><div class="dots-flashing"></div></div>'});

Then you will need to set htmlClassUtilities with:

{
  'custom-loading': {
    styles: {
      default: {
        padding: '0.18em 0.1em 0.1em 0.6em',
      },
    },
  },
}

Finally, for the animation bubbles to have color, you will need to set the following style properties in your project's css style:

:root {
  --message-dots-color: #848484;
  --message-dots-color-fade: #55555533;
}

The above properties can of course be further customised to suit your preferences.

  2. You can go back to your original implementation, but instead of responding with text, respond with html, which lets you split the response into multiple smaller message bubbles just the way you need:

Change the response code to something like this:

const responseHTML = response
  .map((message) => `<div class="custom-text-message">${message.text}</div>`)
  .join('');
signals.onResponse({html: responseHTML});
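With sample server data, the mapping step produces a single HTML string (the sample messages here are illustrative, and joining into one string is how the divs end up in a single html response):

```javascript
// Quick check of the HTML-building step with sample server data.
// `custom-text-message` is the class styled via htmlClassUtilities.
const response = [{ text: 'First answer' }, { text: 'Second answer' }];
const responseHTML = response
  .map((message) => `<div class="custom-text-message">${message.text}</div>`)
  .join('');
```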

In Deep Chat, you will also need to change the following:

Set messageStyles:

{
  html: {
    ai: {
      bubble: {backgroundColor: 'white', margin: '0', padding: '0'},
    },
  },
}

Set htmlClassUtilities:

{
  'custom-text-message': {
    styles: {
      default: {
        backgroundColor: '#e4e6eb',
        borderRadius: '10px',
        padding: '0.42em 0.55em',
        marginTop: '10px',
      },
    },
  },
}
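Wired into the Vue template, the two objects above are bound as component properties; a sketch (values taken from the snippets above, with your existing request handler assumed):

```html
<deep-chat
  :request="{
    handler: sendMessage
  }"
  :messageStyles="{
    html: {
      ai: {bubble: {backgroundColor: 'white', margin: '0', padding: '0'}},
    },
  }"
  :htmlClassUtilities="{
    'custom-text-message': {
      styles: {
        default: {
          backgroundColor: '#e4e6eb',
          borderRadius: '10px',
          padding: '0.42em 0.55em',
          marginTop: '10px',
        },
      },
    },
  }"
/>
```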

Of course, you can change everything here to your preference.

Let me know if you need any further assistance. Thanks!

ckcollab commented 9 months ago

Thanks so much for the great support, this is already working quite well, much appreciated!

OvidijusParsiunas commented 9 months ago

Happy to hear it worked for you @ckcollab!