Welcome to Ollama Chat, an interface for the official Ollama CLI that makes chatting easier. It includes features such as vision support through the LLaVA model.
To use the vision features, you first need to pull LLaVA:

```sh
ollama pull llava:13b
```

Then open Settings & Info and choose the vision model.
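To confirm the download succeeded, you can list the models available on your machine (a quick check, assuming a standard Ollama install):

```sh
# List locally pulled models; llava:13b should appear in the output
ollama list
```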
You also need to install Ollama itself. Once it is installed, you can start your local server with:

```sh
OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
```

`OLLAMA_ORIGINS=*` allows the app to reach the server from any origin (CORS), and `OLLAMA_HOST` binds the server to port 11435 instead of the default 11434.
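For reference, here is a minimal sketch of the kind of request the interface sends to the server started above, using Ollama's standard `/api/chat` endpoint. The prompt and payload are illustrative, not the app's exact request:

```sh
# Send one chat message to the local Ollama server on port 11435.
# "stream": false returns a single complete JSON response instead of chunks.
curl http://127.0.0.1:11435/api/chat -d '{
  "model": "llava:13b",
  "messages": [
    { "role": "user", "content": "Describe what you can do." }
  ],
  "stream": false
}'
```

The response is a JSON object whose `message` field contains the assistant's reply.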
This project is a fork of Twan Luttik's work. Thanks for the first implementation.