
Ollama Chat App 🐐

Welcome to my Ollama Chat, an interface for the official Ollama CLI that makes it easier to chat. It includes features such as image recognition (see below).



Image recognition

To use this feature, you need to pull the llava model.

Use this command:

ollama pull llava:13b

Then open Settings & Info and choose the vision model.
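
To give an idea of what happens under the hood, here is a minimal sketch (not taken from this repository) of sending an image to a vision model through Ollama's REST API. The host and port match the local server described in the build instructions below; the file path and prompt are just examples.

```ts
// Sketch: ask llava to describe an image via Ollama's REST API.
// Assumes the local server from the build instructions is running on 127.0.0.1:11435.
import { readFile } from "node:fs/promises";

async function describeImage(path: string): Promise<string> {
  // Ollama's /api/generate accepts base64-encoded images for vision models.
  const image = (await readFile(path)).toString("base64");

  const res = await fetch("http://127.0.0.1:11435/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llava:13b",
      prompt: "What is in this picture?",
      images: [image],
      stream: false, // return one JSON object instead of a stream
    }),
  });

  const data = await res.json();
  return data.response; // the model's text answer
}

describeImage("./photo.png").then(console.log);
```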

How to build on your machine

Requirements


You also need to install Ollama. Once it is installed, you can run your local server with this command:

OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
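
If you want to check that the server is reachable before starting the app, here is a small sketch (not part of the repository) that lists the models you have pulled using Ollama's /api/tags endpoint:

```ts
// Sketch: verify the local Ollama server is up and list pulled models.
const OLLAMA_URL = "http://127.0.0.1:11435";

async function listModels(): Promise<void> {
  // /api/tags returns the locally available models as { models: [...] }
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama server not reachable: ${res.status}`);
  }
  const data = await res.json();
  for (const model of data.models) {
    console.log(model.name); // e.g. "llava:13b"
  }
}

listModels().catch(console.error);
```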

Thanks

This is a fork of Twan Luttik's project. Thanks for the first implementation.