What is your question?
I set up nomic-embed-text with Ollama locally. In LibreChat I can upload files and query them while chatting with OpenAI, and that works fine, but it does not work in the Assistants section: although the console reports a successful upload, the assistant does not answer queries about the uploaded files correctly.
I need help integrating my local embedding and vector DB setup so it also works with the Assistants endpoint.
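For reference, here is a minimal sketch of the check I use to confirm the local embedding service itself responds (this assumes Ollama's default port 11434 and its /api/embeddings route; adjust the URL if your setup differs):

```python
# Minimal check that the local Ollama embedding endpoint responds.
# Assumes Ollama's default port (11434) and the /api/embeddings route;
# adjust OLLAMA_URL if your setup differs.
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"

resp = requests.post(
    OLLAMA_URL,
    json={"model": "nomic-embed-text", "prompt": "test sentence for embedding"},
    timeout=30,
)
resp.raise_for_status()
embedding = resp.json().get("embedding", [])
print(f"Received embedding of length {len(embedding)}")  # expect 768 dimensions for nomic-embed-text
```

Embedding succeeds this way (as expected, since file queries work in regular chat), so the problem appears to be specific to how the Assistants endpoint uses the uploaded files rather than the embedding service itself.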
More Details
I have attached two screenshots for clarity.
Image from chat (where I can successfully query PDFs):
Image from the assistant (where I can upload files but not query them):
What is the main subject of your question?
Installation
Screenshots
No response
Code of Conduct
[X] I agree to follow this project's Code of Conduct