Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
https://useanything.com
MIT License
15.35k stars · 1.6k forks

[BUG]: Discontinuous conversation #1442

Open chalitbkb opened 2 weeks ago

chalitbkb commented 2 weeks ago

How are you running AnythingLLM?

All versions

What happened?

I found an issue: the conversation does not stay aligned or continuous with the previous dialogue. This may be related to how the conversation history is compiled for the model. I checked the knowledge base, and it should be able to answer based on the documents I have gathered.

*The first question was answered correctly, but the answer to the second question does not follow from the first.

(screenshots attached)

Are there known steps to reproduce?

No response

man2004 commented 2 weeks ago

Seems you have selected query mode? I think if you want to have previous dialogue, you need to choose conversation mode.

chalitbkb commented 2 weeks ago

> Seems you have selected query mode? I think if you want to have previous dialogue, you need to choose conversation mode.

Query mode should still allow asking related questions in sequence while keeping the conversation coherent. I understand that this mode answers solely from the knowledge base, and that if no information is found it declines and tells the user that no data was found. But what use is it if it cannot build on previous questions? This mode should be improved. For example, in the attached screenshots I stayed on the same topic, yet the second question was not connected to the first, even though they are on the same subject and were asked consecutively.

So in this mode, rejection would still occur when no information is found, as it does now, but with one condition: the previous conversation must be combined with the new question so the model stays coherent about what is being discussed. Rejection should happen only after the history, combined with the latest question, truly turns up nothing in the knowledge base; only then should the user be told that no data was found. Chat mode, on the other hand, would look for information beyond the predefined knowledge and would not refuse to answer. That would make more sense.
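The behavior described above is essentially the "condense question" pattern: rewrite the follow-up into a standalone query using the chat history, search the knowledge base with that rewritten query, and refuse only if the combined query still returns nothing. A minimal sketch (not AnythingLLM's actual code; `rewriteFn` and `searchFn` are hypothetical stand-ins for an LLM call and a vector search):

```javascript
// Hypothetical prompt asking an LLM to rewrite a follow-up question
// so it stands alone, using the prior turns for context.
function buildCondensePrompt(history, question) {
  const transcript = history
    .map((turn) => `${turn.role}: ${turn.content}`)
    .join("\n");
  return (
    "Rewrite the follow-up question as a standalone question.\n\n" +
    `Chat history:\n${transcript}\n\n` +
    `Follow-up: ${question}\nStandalone:`
  );
}

// Query-mode flow with the proposed history-aware rejection:
// condense first, search second, refuse only if the combined
// query finds no documents in the knowledge base.
async function answerInQueryMode(history, question, rewriteFn, searchFn) {
  const standalone =
    history.length > 0
      ? await rewriteFn(buildCondensePrompt(history, question))
      : question;
  const docs = await searchFn(standalone);
  if (docs.length === 0) {
    return { refused: true, text: "No relevant data found." };
  }
  return { refused: false, text: `Answer grounded in ${docs.length} doc(s).` };
}
```

The key point is the ordering: the refusal check runs on the history-aware query, not on the raw follow-up, so "it" and other references resolve before the knowledge base is consulted.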

Propheticus commented 2 weeks ago

Currently, in query mode, no historical messages are sent. Each query is handled in isolation.

It sounds like you're looking for a feature enhancement with a hybrid form between query and chat modes, e.g., query mode for the first prompt, then chat mode for subsequent messages.
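That hybrid could be as simple as switching on whether history exists yet. A sketch of the idea (an assumption about how such a feature might work, not an existing AnythingLLM setting):

```javascript
// Hybrid mode sketch: strict query behavior for the first prompt,
// conversational behavior once there is history to build on.
function pickMode(history) {
  return history.length === 0 ? "query" : "chat";
}

// Assemble the messages sent to the model. Query mode sends the
// prompt in isolation; chat mode prepends the prior turns.
function buildPayload(history, question) {
  const mode = pickMode(history);
  return {
    mode,
    messages:
      mode === "query"
        ? [{ role: "user", content: question }]
        : [...history, { role: "user", content: question }],
  };
}
```

This keeps the first answer strictly grounded in the knowledge base while letting follow-ups refer back to it.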