pymike00 / tinychat

🔮 TinyChat is a lightweight Desktop client for modern Language Models designed for straightforward comprehension. Supports OpenAI, Anthropic, Meta, Mistral, Google and Cohere APIs.
MIT License

Support for Ollama API #4

Open imbev opened 10 months ago

imbev commented 10 months ago

Ollama is an open-source application that makes it very easy to run LLMs locally via a CLI or an HTTP API. I suggest adding support for Ollama's API.

https://github.com/jmorganca/ollama/blob/main/docs/api.md
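For reference, the linked docs describe a simple JSON-over-HTTP interface on `localhost:11434`. A minimal Python sketch of a non-streaming call to the `/api/generate` endpoint might look like the following (stdlib only; the model name `llama3` is just an illustrative placeholder for whatever model is pulled locally, and this is an assumption-laden sketch, not TinyChat code):

```python
import json
from urllib import request

# Ollama's documented default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON object instead of a stream
    }
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the model's response text.

    Requires a running Ollama server on localhost:11434.
    """
    req = build_generate_request(model, prompt)
    with request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["response"]
```

Streaming responses (the API's default) would instead arrive as one JSON object per line, which a client like TinyChat could read incrementally to update the UI.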

pymike00 commented 10 months ago

Hi there @imbev !

Thanks for your input!! Yes, support for local models is definitely something that I would like to implement in future releases, right after chat history.

Regarding the Ollama API: is it still WSL-only on Windows? What system do you use it on?

Cheers!!

imbev commented 10 months ago

Hello @pymike00

I currently use Ollama on Debian Linux.

It can be compiled on Windows, but the Windows version is definitely not ready yet. https://github.com/jmorganca/ollama/blob/main/docs/development.md

pymike00 commented 10 months ago

I think in that case it's probably better to wait for proper Windows support before adding code for it.

After all, the main value of the project lies in its simplicity, I think.

Thank you for your feedback, it's much appreciated.

Feel free to post any other suggestions you may have.

Happy Coding!