QuivrHQ / quivr

Opinionated RAG for integrating GenAI in your apps 🧠 Focus on your product rather than the RAG. Easy integration into existing products, with customisation! Any LLM: GPT-4, Groq, Llama. Any vector store: PGVector, Faiss. Any files. Any way you want.
https://core.quivr.com
36.75k stars 3.59k forks

[Bug]: Can't use local model with ollama #3425

Open LronDC opened 1 month ago

LronDC commented 1 month ago

What happened?

Can't find the .env.example file referenced in quickstart.md, and there's also no guide for using a local model.
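For anyone hitting the same wall while the missing .env.example is tracked down, a minimal sketch of the kind of environment file an Ollama setup usually needs. The variable names below (OLLAMA_API_BASE_URL and friends) are assumptions based on common Ollama deployments, not confirmed quivr configuration — check the repo's docs once the example file is restored:

```shell
# Hypothetical .env sketch for pointing quivr at a local Ollama server.
# Ollama listens on port 11434 by default; from inside a Docker container,
# the host machine is typically reachable as host.docker.internal.
OLLAMA_API_BASE_URL=http://host.docker.internal:11434

# Assumed model selector — substitute any model you have pulled locally,
# e.g. after running `ollama pull llama3`.
DEFAULT_MODEL=ollama/llama3

# Many OpenAI-compatible clients still require a key variable to be set,
# even though Ollama ignores it; a dummy value is usually enough.
OPENAI_API_KEY=dummy-key
```

If you run quivr outside Docker, `http://localhost:11434` would replace the `host.docker.internal` address.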

Relevant log output

No response

Twitter / LinkedIn details

No response

linear[bot] commented 1 month ago

CORE-257 [Bug]: Can't use local model with ollama

iemafzalhassan commented 1 month ago

Hi 👋

I would like to work on this issue...