Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[BUG]: In Query mode, will there still be an output of answers if the content of the query cannot be found in the knowledge base? #740

Closed huicewang closed 8 months ago

huicewang commented 8 months ago

How are you running AnythingLLM?

Local development

What happened?

When the chat mode is set to Query mode, responses are generated even if the content of the question is completely unrelated to the knowledge base. However, the chat mode settings mention: "Query will provide answers only if document context is found." This implies that answers should only be given when the document context is identified. Is there a contradiction here?
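The behavior described in the settings text could be sketched as a simple gate: answer only when the vector search returns document context. This is a hypothetical illustration, not AnythingLLM's actual implementation; the function name and refusal message are assumed:

```javascript
// Hypothetical sketch of query-mode gating. `chunks` is the result of
// a vector similarity search: an array of { text, score } objects.
// None of these names come from the AnythingLLM codebase.
function queryModeResponse(chunks, question) {
  if (chunks.length === 0) {
    // No document context found: refuse rather than answer from the
    // model's general knowledge.
    return "There is no relevant information in this workspace to answer your question.";
  }
  // Context was found: build a prompt from the retrieved chunks.
  const context = chunks.map((c) => c.text).join("\n---\n");
  return `Context:\n${context}\n\nQuestion: ${question}`;
}
```

The report above suggests the gate is not firing, i.e. the search is returning *some* context (however weakly similar) for unrelated questions, so the refusal branch is never reached.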


Are there known steps to reproduce?

No response

huicewang commented 8 months ago

When displaying the cited documents, could the similarity score of each document be shown, or the results sorted by similarity? Otherwise the displayed content can have very low similarity and essentially no reference value.

timothycarambat commented 8 months ago

That context means the RAG process determined it was semantically similar (not that it was useful! It does not do an evaluation, only a similarity comparison). So it is possible to get context that is fundamentally useless for your question but still came back from the vector DB.

We can add the score to each chunk though - which I think would be useful for debugging. Again, that is only the chunk's semantic-similarity score from the vector DB, not a guarantee that it will answer your question fully.
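Surfacing the score on each citation could look roughly like this (a hypothetical sketch; the chunk object shape and the `minScore` cutoff are assumed for illustration, not taken from the codebase):

```javascript
// Hypothetical sketch: attach the vector-DB similarity score to each
// cited chunk, drop low-similarity hits, and sort most-similar first.
// `chunks` is assumed to be an array of { title, score } objects.
function citationsWithScores(chunks, minScore = 0.25) {
  return chunks
    .filter((c) => c.score >= minScore)   // hide essentially-unrelated hits
    .sort((a, b) => b.score - a.score)    // most similar first
    .map((c) => ({ title: c.title, score: Number(c.score.toFixed(3)) }));
}
```

Showing the raw score this way would let users judge for themselves whether a cited chunk is a near-match or just the least-dissimilar thing in the vector DB.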