Purg / ollama-compose

Docker Compose setup around Ollama and Open-WebUI
Apache License 2.0

Have a version that works for Mac environments #1

Open jeffbaumes opened 1 month ago

jeffbaumes commented 1 month ago

My Mac doesn't like the required Nvidia device allocation, so I needed to comment it out. Since docker-compose overrides don't really support removing keys, I'm not sure of a good way to make this optional. I was able to get llama3:8b working reasonably (a few words a second) on my MacBook Pro without this deploy section.

```diff
index cf24abd..f97c4b8 100644
--- a/docker-compose.yaml
+++ b/docker-compose.yaml
@@ -9,13 +9,13 @@ services:
     #ports:
     #  # Standard port for Ollama. Don't really want to expose this unencrypted.
     #  - 127.0.0.1:11434:11434
-    deploy:
-      resources:
-        reservations:
-          devices:
-            - capabilities: [gpu]
-              driver: nvidia
-              count: all
+    # deploy:
+    #   resources:
+    #     reservations:
+    #       devices:
+    #         - capabilities: [gpu]
+    #           driver: nvidia
+    #           count: all
     restart: unless-stopped
```
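One way to make the GPU reservation optional, rather than commenting it out, is to move the `deploy` section into a separate override file that Linux/Nvidia users layer on explicitly. This is a sketch, not the repo's current layout; the file name `docker-compose.gpu.yaml` is hypothetical:

```yaml
# docker-compose.gpu.yaml (hypothetical) — only applied when passed
# explicitly with -f, so the base compose file stays GPU-free for macOS.
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu]
              driver: nvidia
              count: all
```

Mac users would then run `docker compose up -d` against the base file alone, while GPU users run `docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d`. Because the base file no longer contains the `deploy` key, nothing ever needs to be removed by an override.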

Also, maybe consider some additional env variables for setting the http/s ports. My Mac doesn't like me binding to privileged ports like 443 and 80:

```diff
@@ -44,8 +44,8 @@ services:
       - ./nginx/dhparam.pem:/etc/ssl/certs/dhparam.pem:ro
       - ./nginx/nginx.conf:/etc/nginx/conf.d/default.conf:ro
     ports:
-      - 127.0.0.1:443:443
-      - 127.0.0.1:80:80
+      - 127.0.0.1:${NGINX_HTTPS_PORT:-443}:443
+      - 127.0.0.1:${NGINX_HTTP_PORT:-80}:80
     restart: unless-stopped
```
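With the parameterized ports above, a `.env` file next to the compose file (which Compose reads automatically for variable interpolation) would let Mac users pick unprivileged ports without touching the YAML. The values here are just examples, not ports the repo prescribes:

```ini
# .env — example values; any ports above 1024 avoid the
# privileged-port restriction. Defaults (443/80) apply if unset.
NGINX_HTTPS_PORT=8443
NGINX_HTTP_PORT=8080
```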
Purg commented 1 month ago

I know how to layer in settings like the GPU settings; I just didn't get around to it in the first pass. Great idea, though, and this issue is a good reminder to factor that out in a future pass.
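For reference, the usual layering mechanism is Compose's multi-file merge, invoked either with repeated `-f` flags or the `COMPOSE_FILE` environment variable. A sketch, assuming a hypothetical `docker-compose.gpu.yaml` override that holds the Nvidia `deploy` section:

```shell
# Base file only (macOS / no GPU):
docker compose -f docker-compose.yaml up -d

# Layer the optional GPU override on top (Linux + Nvidia):
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d

# Equivalent via COMPOSE_FILE (path separator is ":" on Linux/macOS):
COMPOSE_FILE=docker-compose.yaml:docker-compose.gpu.yaml docker compose up -d
```

Later files in the list are merged over earlier ones, so additive settings like the GPU reservation can live entirely in the override.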