PromtEngineer / localGPT

Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
Apache License 2.0

Adding chat history gives wrong answer #635

Open pvtoan opened 8 months ago

pvtoan commented 8 months ago

Hi,

I'm running tests that combine chat history (ConversationBufferWindowMemory), local-data retrieval, and an LLM (Baichuan2-13B-Chat) to generate answers.

I ran two tests (a sketch of the setup follows the list). Note that all questions relate to the local data I feed to the LLM.

  1. Without chat history (ConversationBufferWindowMemory with k = 0), the answers are reasonably good: the LLM returns at least information related to my question.

  2. With chat history (ConversationBufferWindowMemory with k = 4), when I ask a follow-up question that is clearly different from the previous one, I still get the same answer as for the previous question. This happened several times.
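A minimal sketch of this setup (assumptions: LangChain's classic API with ConversationalRetrievalChain; `llm` and `retriever` are placeholders for my Baichuan2-13B-Chat instance and the retriever over the local documents, both built elsewhere):

```python
# Sketch of the two tests, assuming LangChain's ConversationalRetrievalChain.
# `llm` (Baichuan2-13B-Chat) and `retriever` (over the local documents) are
# placeholders constructed elsewhere.
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferWindowMemory

def build_chain(llm, retriever, k):
    # k = number of past exchanges kept verbatim in the prompt window.
    memory = ConversationBufferWindowMemory(
        k=k,
        memory_key="chat_history",
        return_messages=True,
    )
    # The chain condenses (new question + chat_history) into a standalone
    # question before retrieval, which is where stale history can leak in.
    return ConversationalRetrievalChain.from_llm(
        llm=llm,
        retriever=retriever,
        memory=memory,
    )

# Test 1: chain = build_chain(llm, retriever, k=0)  # answers track the question
# Test 2: chain = build_chain(llm, retriever, k=4)  # repeats the previous answer
# result = chain({"question": "..."})["answer"]
```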

Has anyone run into the same problem, and is there a way to fix it?

PromtEngineer commented 8 months ago

We probably need to look at how the history is handled. In an ongoing conversation, the history may be irrelevant to the current question, yet it is still used to generate the answer. We probably need to use a summary of the previous conversation instead.
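For example, something along these lines (a sketch against the setup above; LangChain's ConversationSummaryBufferMemory is one ready-made option, and the max_token_limit value is illustrative):

```python
# Sketch: swap the window memory for a summary memory so older turns are
# condensed by the LLM instead of pasted verbatim into the prompt.
# `llm` and `retriever` are the same placeholders as before.
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationSummaryBufferMemory

def build_summary_chain(llm, retriever, max_token_limit=512):
    memory = ConversationSummaryBufferMemory(
        llm=llm,                          # this LLM writes the running summary
        max_token_limit=max_token_limit,  # recent turns kept verbatim under this budget
        memory_key="chat_history",
        return_messages=True,
    )
    return ConversationalRetrievalChain.from_llm(
        llm=llm, retriever=retriever, memory=memory
    )
```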

pvtoan commented 8 months ago

Yeah, that's a promising approach to try.

LangChain also provides a way to compress text using LLMs.
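If that refers to LangChain's contextual compression, a sketch would look like this (same `llm` and `retriever` placeholders as above; LLMChainExtractor asks the LLM to keep only the passages relevant to the current query):

```python
# Sketch: wrap the base retriever so each retrieved document is trimmed
# by the LLM to the parts relevant to the query before answering.
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import LLMChainExtractor

def build_compression_retriever(llm, retriever):
    compressor = LLMChainExtractor.from_llm(llm)
    return ContextualCompressionRetriever(
        base_compressor=compressor,
        base_retriever=retriever,
    )
```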

Thank you!