Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
https://anythingllm.com

[FEAT]: default q&a #736

Closed Mirgiacomo closed 3 months ago

Mirgiacomo commented 6 months ago

What would you like to see?

It would be very helpful to have some kind of Q&A: a section where I write a hypothetical question and also the answer that the chatbot should give.

Like:
User: "Where can I contact you?"
Chatbot: "You can write an email to info@chatbot.com"

I saw that there is already a similar section, but it only allows you to create predefined questions (and not the answers as well).
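
For illustration only, here is a minimal sketch of what such a workspace-level Q&A table could look like, with an exact-match lookup that short-circuits the LLM call. All names here (`PredefinedQA`, `matchPredefinedAnswer`) are hypothetical and not part of AnythingLLM.

```typescript
// Hypothetical shape of a workspace-level predefined Q&A entry
// (not an existing AnythingLLM structure).
interface PredefinedQA {
  question: string; // the question an admin expects users to ask
  answer: string;   // the canned response to return instead of calling the LLM
}

// Example workspace entries (illustrative values only).
const workspaceQA: PredefinedQA[] = [
  {
    question: "where can I contact you?",
    answer: "You can write an email to info@chatbot.com",
  },
];

// Return a canned answer on an exact (case-insensitive) match,
// or null so the caller falls through to the normal LLM pipeline.
function matchPredefinedAnswer(
  userMessage: string,
  entries: PredefinedQA[],
): string | null {
  const normalized = userMessage.trim().toLowerCase();
  const hit = entries.find(
    (entry) => entry.question.trim().toLowerCase() === normalized,
  );
  return hit ? hit.answer : null;
}

// matchPredefinedAnswer("Where can I contact you?", workspaceQA)
// -> "You can write an email to info@chatbot.com"
```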

timothycarambat commented 6 months ago

These would be workspace-related though, not system-wide? I ask because you could do this in the "Default Messages" section under System > Appearance settings.

Mirgiacomo commented 6 months ago

Yes, in the workspace.

I saw that there is the "Default Messages" section, but it doesn't really do the same thing: there you can only set default messages to be sent in the chat, not the default answer.

I don't know if I've been clear enough

timothycarambat commented 6 months ago

No, you have been clear - you want to just have it "auto-respond" and send nothing to the LLM at all. Basically fake a response?

Mirgiacomo commented 6 months ago

Essentially, yes. As a first step one could give a "fake" answer without going through the LLM.

An additional level of utility would be the ability to intercept the same question asked in other ways (but for this I think we will need the LLM's help).

That is, if I define that when the user asks "how can I contact you?" the chatbot responds "you can write an email to example@test.com!", it would be interesting for the same response to also be given when the user's question changes slightly, e.g. "hello, how do I contact you?", "contacts info", or "I would like to contact you". It should associate all of these very similar messages with the one "general" question and then always give the same answer, in this case "you can write an email to example@test.com!"
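
One way this could plausibly work, sketched below under the assumption that an embedder is already available for the workspace: compare the embedding of the user's message against the embeddings of the stored questions and return the canned answer when the cosine similarity clears a threshold. Everything here (`QAEntry`, `Embedder`, `matchBySimilarity`, the 0.85 cutoff) is hypothetical, not an actual AnythingLLM API.

```typescript
// Hypothetical fuzzy matcher: map a paraphrased user message to the closest
// predefined question by embedding cosine similarity. The `embed` function
// (text -> vector) is assumed to be supplied by whatever embedder the
// workspace already uses.
type QAEntry = { question: string; answer: string };
type Embedder = (text: string) => Promise<number[]>;

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Returns the canned answer if the user's message is close enough to one of
// the predefined questions, otherwise null (fall through to the LLM).
async function matchBySimilarity(
  userMessage: string,
  entries: QAEntry[],
  embed: Embedder,
  threshold = 0.85, // illustrative cutoff; would need tuning per embedder
): Promise<string | null> {
  const messageVec = await embed(userMessage);
  let best: { score: number; answer: string } | null = null;

  for (const entry of entries) {
    const score = cosineSimilarity(messageVec, await embed(entry.question));
    if (!best || score > best.score) best = { score, answer: entry.answer };
  }
  return best && best.score >= threshold ? best.answer : null;
}
```

With something like this, "hello, how do I contact you?" or "I would like to contact you" should embed close to the stored "how can I contact you?" entry and return the same canned answer, while unrelated messages would miss the threshold and go to the LLM as usual.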

timothycarambat commented 3 months ago

Closing this as wontfix, as this would be very use-case specific for a lot of people and is out of scope for AnythingLLM, which is primarily for internal use within an organization. We offer the chat embed because it's so widely asked for, but it's not core to our offering, so we won't be building out more use-case-specific functionality like this for the foreseeable future.