ivanfioravanti / chatbot-ollama

Chatbot Ollama is an open source chat UI for Ollama.

complete docker-compose.yml #28

Closed joecryptotoo closed 6 months ago

joecryptotoo commented 7 months ago

This will start up everything you need to get going.
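The compose file attached to the issue did not survive the scrape. A minimal sketch of the kind of setup described might look like the following; the image names, tag, and `OLLAMA_HOST` variable are assumptions and should be checked against the registry and the repo's README:

```yaml
# Hypothetical docker-compose.yml sketch: service and image names are assumptions.
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama:/root/.ollama      # persist downloaded models across restarts

  chatbot-ollama:
    image: ghcr.io/ivanfioravanti/chatbot-ollama:main  # assumed image name
    ports:
      - "3000:3000"
    environment:
      - OLLAMA_HOST=http://ollama:11434  # point the UI at the ollama service
    depends_on:
      - ollama

volumes:
  ollama:
```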

SaundersB commented 7 months ago

After further testing on an M3 Max, I found Ollama runs extremely slowly in Docker compared to running natively on my local machine. However, having Ollama running with very little external configuration may still be useful to others.

joecryptotoo commented 7 months ago

> After further testing on an M3 Max, I found Ollama runs extremely slowly in Docker compared to running natively on my local machine. However, having Ollama running with very little external configuration may still be useful to others.

Running this in Docker on a Mac won't give it access to the GPU, so it was running on CPU only. I'm running it on an HP server with an Nvidia RTX 4090 attached.
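For reference, granting a container access to an Nvidia GPU with Docker Compose typically requires the NVIDIA Container Toolkit on the host plus a device reservation along these lines (the service and image names are illustrative):

```yaml
# Compose fragment: reserve the host's Nvidia GPU(s) for the ollama service.
# Requires the NVIDIA Container Toolkit installed on the host.
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or a specific number of GPUs
              capabilities: [gpu]
```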

ivanfioravanti commented 6 months ago

This is great, but I think it's better to keep Ollama out of Docker, because on a Mac it can't use the GPU, as both of you @joecryptotoo and @SaundersB confirmed.
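One way to follow that suggestion is to run Ollama natively and keep only the UI in Docker, pointing it at the host. A sketch, assuming Docker Desktop's `host.docker.internal` alias and an image name that should be verified against the registry:

```yaml
# Hypothetical compose fragment: only the UI runs in Docker;
# Ollama runs natively on the host (started with `ollama serve`).
services:
  chatbot-ollama:
    image: ghcr.io/ivanfioravanti/chatbot-ollama:main  # assumed image name
    ports:
      - "3000:3000"
    environment:
      # host.docker.internal resolves to the host on Docker Desktop (macOS/Windows)
      - OLLAMA_HOST=http://host.docker.internal:11434
```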