aliabbasjp opened this issue 1 month ago
Hey @aliabbasjp, the RAG agent is a multi-turn LLM agent that can choose to answer a query with or without a search; when it does search, it makes a tool call to run it.
Question and answer, by contrast, always performs a search and is single-turn.
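If it helps, here's a rough sketch of the difference in plain Python. This is not our actual implementation; the names (`search_documents`, `llm.chat`, the tool-call fields) are made up purely to illustrate "multi-turn with an optional search tool" versus "always search, single turn":

```python
# Hypothetical sketch only: none of these names come from the project.

def search_documents(query: str) -> list[str]:
    """Stand-in for the retrieval step (vector search, keyword search, etc.)."""
    return [f"chunk relevant to: {query}"]

def rag_agent(llm, messages: list[dict]) -> str:
    """Multi-turn: on each turn the LLM decides whether to call the search tool."""
    while True:
        reply = llm.chat(messages, tools=["search_documents"])  # hypothetical API
        if reply.tool_call is None:
            return reply.content  # answered without (further) search
        results = search_documents(reply.tool_call.arguments["query"])
        messages.append({"role": "tool", "content": "\n".join(results)})

def question_answer(llm, question: str) -> str:
    """Single-turn: retrieval always happens, then exactly one completion."""
    context = "\n".join(search_documents(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}"
    return llm.chat([{"role": "user", "content": prompt}]).content
```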
In #1104 you mention that you're running a local LLM. In the sidebar on the left you need to specify the model; right now it defaults to OpenAI (which, as you noted, you're not using, so it throws a 500 error). This will be updated in our next release so that we no longer set the model in the UI by default.
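In practical terms, the request has to go to your local endpoint rather than api.openai.com. Here's a hedged sketch assuming your local LLM exposes an OpenAI-compatible API (for example, Ollama's `/v1` route); the URL, key, and model name are placeholders, not values from our codebase:

```python
# Hedged sketch: point an OpenAI-compatible client at a local server instead
# of api.openai.com. URL, key, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint instead of api.openai.com
    api_key="not-needed-locally",          # most local servers ignore the key
)

resp = client.chat.completions.create(
    model="llama3",  # whatever model your local server actually serves
    messages=[{"role": "user", "content": "Sanity check: is the local LLM reachable?"}],
)
print(resp.choices[0].message.content)
```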
This question is related to #1104, so I'll keep this one open and close the other. Let me know if this solves it for you!
Thanks. So how can I configure the tooling the RAG agent uses? Is Question Answer using the default OpenAI?
You can select the model on the sidebar, like so:
We don't currently have a way to modify or add custom tools, but we hope to support this soon.
I am unable to run Q&A from the UI; it's throwing a network error.
I see that when I use question answer it defaults to OpenAI, even though the rest of the config is set to use only the local LLM, and I see no way to configure it. It errors out in the logs, showing it is unable to connect to OpenAI.