Open · imbev opened this issue 10 months ago

Ollama is an open-source application that makes it very easy to run LLMs via a CLI or an HTTP API. I suggest adding support for Ollama's API.

https://github.com/jmorganca/ollama/blob/main/docs/api.md
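For example, here is a minimal sketch of what calling the generate endpoint described in that doc looks like from Python. It assumes an Ollama server running locally on the default port 11434; the model name is just a placeholder:

```python
import json

import requests  # third-party: pip install requests

# Assumes a local Ollama server on the default port.
OLLAMA_URL = "http://localhost:11434/api/generate"


def ask_ollama(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to Ollama and collect the streamed response."""
    payload = {"model": model, "prompt": prompt}
    answer = []
    with requests.post(OLLAMA_URL, json=payload, stream=True) as resp:
        resp.raise_for_status()
        # Ollama streams one JSON object per line; the last one has "done": true.
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            answer.append(chunk.get("response", ""))
            if chunk.get("done"):
                break
    return "".join(answer)


if __name__ == "__main__":
    print(ask_ollama("Why is the sky blue?"))
```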
Hi there @imbev !
Thanks for your input!! Yes, support for local models is definitely something that I would like to implement in future releases, right after chat history.
Regarding the Ollama API: is it still WSL-only on Windows? What system do you use it on?
Cheers!!
Hello @pymike00
I currently use Ollama on Debian Linux.
It can be compiled on Windows, but the Windows version is definitely not ready yet. See https://github.com/jmorganca/ollama/blob/main/docs/development.md
In that case, I think it's probably better to wait for proper Windows support before adding code for it.
After all, the main value of the project lies in its simplicity.
Thank you for your feedback, it's much appreciated.
Feel free to post any other suggestions you may have.
Happy Coding!