Open k2an opened 3 months ago
We're definitely interested in adding Ollama support to this project. Thanks for opening this issue.
I'm also looking forward to this feature! ✨
👀
Yeah, it would be great to support Ollama, LM Studio, llama.cpp, and other well-known open-source LLMs, like MiniCPM for vision.
I love your project and want to use it with local Ollama + llava. I've tried many things, including asking ChatGPT. I'm on Windows 11; I tried Docker with no luck, and I also changed the API address in the frontend settings.
I verified that my local Ollama + llava setup answers and runs correctly by testing it with Postman.
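For anyone who wants to reproduce that Postman check from a script, something like the following should confirm the local Ollama + llava endpoint is answering. This is a minimal sketch, assuming a default Ollama install (port 11434) with the `llava` model already pulled; it uses Ollama's documented `/api/generate` endpoint and only the standard library.

```python
import json
import urllib.request

# Default Ollama endpoint (assumption: stock install on localhost).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects.

    stream=False asks for a single JSON object instead of a stream
    of newline-delimited chunks, which is easier to inspect by hand.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """POST a prompt to the local Ollama server and return its reply."""
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Sanity check: if this prints a sentence, the local server works
    # and the remaining problem is in the app's configuration.
    print(ask_ollama("llava", "Describe yourself in one sentence."))
```

If this script succeeds but the app still fails, the issue is likely in how the frontend/backend are pointed at the Ollama URL rather than in Ollama itself.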
I also changed `frontend\src\lib\models.ts` and `backend\llm.py`.
Actual model versions that are passed to the LLMs and stored in our logs
Console and backend errors are below.
If it can be used with a local server, it'll be awesome! Thanks for your consideration.