OvidijusParsiunas / deep-chat

Fully customizable AI chatbot component for your website
https://deepchat.dev
MIT License

chat history manipulation #92

Closed linqingfan closed 5 months ago

linqingfan commented 5 months ago

Is it possible to manipulate the chat history directly? I'm trying to implement a "continue" feature, because when the max tokens limit is reached, the LLM returns a partial answer. A continue feature is needed so that the rest of the answer can be concatenated to the last character of the previous AI answer.

OvidijusParsiunas commented 5 months ago

Just to make sure I understand this correctly: when the chat retrieves a response, you want to have a "continue" feature to get another response from the server which would contain the rest of the response? If that is the case, then I would recommend using the handler function which will allow you to manipulate the response just the way you want it and make as many calls as you need before displaying the result in the chat.
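For illustration, a minimal sketch of the handler approach (not an exact copy of the library API: the request.handler path, the /my-llm-endpoint URL and the finished flag are assumptions here, so please check the docs for the real signature):

    const chatElementRef = document.getElementById("chat-element");
    chatElementRef.request = {
      handler: async (body, signals) => {
        let fullText = "";
        let finished = false;
        while (!finished) {
          // hypothetical endpoint - replace with your own server/LLM call
          const result = await fetch("/my-llm-endpoint", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ ...body, continuation: fullText }),
          }).then((res) => res.json());
          fullText += result.text; // concatenate the partial answers
          finished = result.finished; // hypothetical flag marking the answer as complete
        }
        signals.onResponse({ text: fullText }); // display the full answer in the chat
      },
    };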

If you are concerned about token limitations, I would recommend the use of the requestBodyLimits property which will automatically limit what is sent to the server. Additionally, you can use the requestInterceptor to manipulate what is sent to the server.
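As a rough sketch of those two options (the maxMessages value and the max_tokens field are illustrative assumptions about your setup):

    const chatElementRef = document.getElementById("chat-element");
    // only send the last few messages to the server to stay within token limits
    chatElementRef.requestBodyLimits = { maxMessages: 4 };
    // adjust the outgoing body before it is sent to the server
    chatElementRef.requestInterceptor = (requestDetails) => {
      requestDetails.body = { ...requestDetails.body, max_tokens: 250 };
      return requestDetails;
    };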

If you are looking for something else, please provide more information about your use-case.

Thank you!

linqingfan commented 5 months ago


[Screenshot from 2024-01-08 11-43-40]

Yes, as shown in the screenshot, when the user clicks the button, the intention is to clear the button and continue from the message. It MUST continue from "Grease" in the example, because the whole answer is in markdown table format; otherwise the formatting will break.

I notice there is a submitUserMessage method, but how do I access the chat history and send it to the LLM to let it continue? If we could directly access the chat history during the request or response, it would be easy. Increasing the max tokens (the LLM output limit) was intentionally not done. The example shown uses a max tokens value of 30 for demonstration purposes; in practice, a max tokens value of 250 is common, and long answers of more than 500 words are also quite common. It is a better user experience to have the option to skip to another question instead of waiting for a long, complete answer, especially on a slow compute system.

By the way, to let the LLM continue the answer, the chat history needs to be sent to the LLM. If we had access to the chat history during the request or response methods and could modify its content, that would be great; whatever is modified would then be shown in the chat, instead of only having the clearMessages method.

OvidijusParsiunas commented 5 months ago

In order to send the full chat history to the server, set the maxMessages property in requestBodyLimits to 0.
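For example (assuming the element id used later in this thread):

    const chatElementRef = document.getElementById("chat-element");
    chatElementRef.requestBodyLimits = { maxMessages: 0 }; // 0 sends the full history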

There is no real way to manipulate the chat history, so if you want to remove the previous message, I can suggest using the overwrite property in your Response messages.
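A rough sketch of the overwrite idea (the accumulator variable here is purely illustrative):

    let answerSoFar = ""; // hypothetical accumulator for the answer received so far
    const chatElementRef = document.getElementById("chat-element");
    chatElementRef.responseInterceptor = (response) => {
      answerSoFar += response.text;
      // overwrite replaces the previous AI message instead of appending a new one
      return { text: answerSoFar, overwrite: true };
    };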

The use of a continue button is not easy to achieve, as the submitUserMessage method expects text as its argument; therefore the chat will display a new user message every time the method is called.

I probably have three suggestions that would work best for you which you can explore:

I will try to help as much as I can, but your use-case appears to be very unique, hence the configuration is inherently complicated. Let me know if you need other help.

linqingfan commented 5 months ago

As I'm new to JavaScript/web programming, I tried to use an easier approach. I managed to manipulate the chat history using a combination of getMessages, clearMessages and addMessage. However, I needed to use an older version in order to use addMessage. Thanks for the advice!

OvidijusParsiunas commented 5 months ago

The addMessage method is available in the newest Deep Chat versions, except it is called _addMessage. The reasons why it was removed from the main API are explained in this thread.
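For reference, it can be called in roughly the same way (the exact message shape and signature may differ between versions, so check the linked thread/docs):

    const elementRef = document.getElementById("chat-element");
    elementRef._addMessage({ role: "ai", text: "...continued answer..." });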

May I ask how you are using it, or why you find it useful, as we may include it in the main API in the future? Thanks!

linqingfan commented 5 months ago

For the button image in my previous post, I can delete the button and continue from the message by intercepting the response and calling the following code before returning the response:

    function clearbtn() {
      const elementRef = document.getElementById("chat-element");
      const chathistory = elementRef.getMessages(); // store the chat history
      elementRef.clearMessages();
      // re-add the chat history, excluding the previous answer that contains the button
      chathistory.slice(0, -2).forEach((message) => elementRef.addMessage(message));
    }

The response text consists of the concatenated answer. This way, I can manipulate the chat history any way I want, though it is not as efficient as being able to access the chat history directly.
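Roughly, the wiring looks something like this (the exact interceptor signature may differ):

    const elementRef = document.getElementById("chat-element");
    elementRef.responseInterceptor = (response) => {
      clearbtn(); // remove the previous partial answer (with the button) from the chat
      return response; // the server already returns the concatenated answer
    };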

OvidijusParsiunas commented 5 months ago

Ohhhhh I see, that is very clever!

You could also potentially reinstantiate the message history by setting the initialMessages property, e.g. elementRef.initialMessages = chatHistory;. This will cause a full re-render of the chat, which may not be what you want, but it can certainly be used as an alternative.
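A rough sketch of that alternative (the slice amount is illustrative):

    const elementRef = document.getElementById("chat-element");
    const trimmedHistory = elementRef.getMessages().slice(0, -2); // drop the previous answer with the button
    elementRef.initialMessages = trimmedHistory; // triggers a full re-render of the chat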

linqingfan commented 5 months ago

Thanks again for the excellent work!