valiantlynx / ollama-docker

Welcome to the Ollama Docker Compose Setup! This project simplifies the deployment of Ollama using Docker Compose, making it easy to run Ollama with all its dependencies in a containerized environment.
https://ollama-docker.azurewebsites.net/

Changing 11434 as the default system port #8

Closed: sandeepzgk closed this 3 months ago

sandeepzgk commented 4 months ago

The use of port 11434 conflicts with the default local installation of Ollama. It might be better to map the container to a different host port so that it does not conflict with any existing Ollama installation.

To Reproduce
Steps to reproduce the behavior:

  1. Install Ollama on the host machine
  2. Start docker compose for ollama-docker

Expected behavior
The containers should start normally. Instead, Docker reports a port binding error because port 11434 is already in use on the system by the native Ollama installation.

Suggested Fix
Map a different host port in docker-compose (both the GPU and non-GPU versions), for example as sketched below.
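
The issue text ends before the concrete mapping, but a minimal sketch of what such a remapping could look like is below, assuming the repo's `ollama` service keeps the stock image and only the host-side port changes (service name, volume name, and the host port 11435 are illustrative, not taken from the repo's compose files):

```yaml
# docker-compose.yml (excerpt) -- remap the host port so a native Ollama
# install, which already binds 11434, does not clash with the container.
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11435:11434"   # host 11435 -> container 11434
    volumes:
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```

Clients inside the compose network would still reach the container on `ollama:11434`; only connections from the host would use the new port (e.g. `http://localhost:11435`).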


valiantlynx commented 4 months ago

I agree. I didn't think about people who already have an installation of Ollama. I'll do that.

valiantlynx commented 4 months ago

I've thought about it: if someone already has Ollama running, they could just comment out the ollama service. Another option would be a bash script that asks whether you already have Ollama and whether you have a GPU.

It doesn't make sense to have two Ollama instances running, and this doesn't affect an existing Ollama installation; it just uses it. What do you think? A sketch of the comment-out approach is shown below.
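
A minimal sketch of the "comment out ollama" idea, assuming the web UI is Open WebUI and that the service names here (`ollama`, `ollama-webui`) mirror the repo's compose file, which they may not:

```yaml
# docker-compose.yml (excerpt) -- reuse the Ollama already running on the host
# instead of starting a second instance in a container.
services:
  # ollama:                          # disabled: a native Ollama is already
  #   image: ollama/ollama:latest    # listening on the host's port 11434
  #   ports:
  #     - "11434:11434"

  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point the UI at the host's existing Ollama rather than a container.
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"   # required on Linux for host access
    ports:
      - "8080:8080"
```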

valiantlynx commented 3 months ago

Now it should be easy: you can have multiple Ollama hosts, and they will be load balanced if you want. A rough sketch of that setup is below.
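
The thread doesn't show the repo's exact mechanism; one way to read "multiple Ollama hosts, load balanced" is Open WebUI's support for several backends via a semicolon-separated `OLLAMA_BASE_URLS` list, roughly like this (service names, ports, and the two-backend layout are illustrative assumptions):

```yaml
# Sketch: two Ollama backends behind one web UI. Open WebUI distributes
# requests across the URLs listed in OLLAMA_BASE_URLS.
services:
  ollama-a:
    image: ollama/ollama:latest
    ports:
      - "11435:11434"   # host port chosen to avoid a native install on 11434

  ollama-b:
    image: ollama/ollama:latest
    ports:
      - "11436:11434"

  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - OLLAMA_BASE_URLS=http://ollama-a:11434;http://ollama-b:11434
    ports:
      - "8080:8080"
```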