wandb / openui

OpenUI lets you describe UI using your imagination, then see it rendered live.
https://openui.fly.dev
Apache License 2.0

proxy application via nginx #40

Open phoedos opened 3 months ago

phoedos commented 3 months ago

I got the idea of using your project to generate mockups, so I had to move the service from my local PC to a VPS.

I made an nginx config like this:

map $http_upgrade $connection_upgrade {
  default upgrade;
  '' close;
}

server {
     listen 80;
     server_name  openui.myserver.com;
     access_log /var/log/nginx/access_openui.log;
     error_log  /var/log/nginx/error_openui.log;
     rewrite     ^   https://$server_name$request_uri? permanent;
}

server {
        listen 443 ssl http2;
        server_name  openui.myserver.com;
        access_log /var/log/nginx/access_openui.log;
        error_log  /var/log/nginx/error_openui.log;
        ssl_certificate /etc/letsencrypt/live/myserver.com/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/myserver.com/privkey.pem;
        include ssl.conf;
        include allow_confs/allow_*.conf;
        deny all;

    location / {
        proxy_pass http://127.0.0.1:7000;
        proxy_redirect     default;
        proxy_http_version 1.1;

        proxy_set_header   Connection        $connection_upgrade;
        proxy_set_header   Upgrade           $http_upgrade;

        proxy_set_header   Host              $http_host;
        proxy_set_header   X-Real-IP         $remote_addr;
        proxy_set_header   X-Forwarded-For   $proxy_add_x_forwarded_for;
        proxy_set_header   X-Forwarded-Proto $scheme;
        proxy_max_temp_file_size 0;
    }
}

and docker-compose.yml 
version: '3.7'

services:

  openui:
    image: wandb/openui:latest
    hostname: openui
    container_name: openui
    environment:
      OPENAI_API_KEY: 'sk-****'
    ports:
      - "127.0.0.1:7000:7878"
    networks:
      - openui

networks:
  openui:
    driver: bridge

The application opens in the browser at https://myserver.com, but the frontend hardcodes request URLs, for example: http://127.0.0.1:7878/openui/index.html?buster=113 fails with NS_CONNECTION_REFUSED, and the tags request responds with HTTP 500: https://myserver.com/v1/ollama/tags 500

So the page is displayed in the browser via the real DNS name, requests pass through, and the generated HTML code is shown, but nothing appears on the design visualization page.

vanpelt commented 3 months ago

If you set OPENUI_HOST=https://myserver.com in your docker-compose.yaml, that should fix it!
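Concretely, that would look something like this in the compose file from above (a sketch based on the original docker-compose.yml in this thread; the hostname and API key are placeholders for your own values):

```yaml
services:
  openui:
    image: wandb/openui:latest
    environment:
      OPENAI_API_KEY: 'sk-****'
      # Tell the backend which public URL the frontend is served from
      OPENUI_HOST: 'https://myserver.com'
    ports:
      - "127.0.0.1:7000:7878"
```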

The ollama/tags call will error as long as you don't expose an Ollama service to the container, and that's fine if you're specifying an OPENAI_API_KEY.

Let me know if you're still having issues.

vanpelt commented 3 months ago

BTW I just added a docker-compose to the repo that shows how to expose Ollama should you want to.
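For reference, a compose file exposing Ollama to the backend might look roughly like this (a sketch only; the service names, the OLLAMA_HOST variable, and the volume layout are assumptions and may differ from the file actually added to the repo):

```yaml
services:
  backend:
    image: wandb/openui:latest
    environment:
      OPENAI_API_KEY: 'sk-****'
      # Assumed variable name: point the backend at the ollama service below
      OLLAMA_HOST: 'http://ollama:11434'
    ports:
      - "127.0.0.1:7000:7878"
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama:/root/.ollama   # persist pulled models between restarts

volumes:
  ollama:
```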

phoedos commented 3 months ago

Dear @vanpelt, I got your compose file. Unfortunately OPENUI_HOST does not help. Evidence that it is set:

root@phoed147095:/opt/openui# docker ps
CONTAINER ID   IMAGE                  COMMAND               CREATED         STATUS         PORTS                      NAMES
2a7266ead7d4   wandb/openui:latest    "python -m openui"    3 minutes ago   Up 3 minutes   127.0.0.1:7000->7878/tcp   openui_backend_1
7ab43093834f   ollama/ollama:latest   "/bin/ollama serve"   3 minutes ago   Up 3 minutes   11434/tcp                  openui_ollama_1
root@phoed147095:/opt/openui# docker exec -it openui_backend_1 /bin/bash
root@2a7266ead7d4:/app# echo $OPENUI_HOST
https://myserver.com

The container logs show nothing interesting:

Creating network "openui_default" with the default driver
Creating openui_ollama_1 ... done
Creating openui_backend_1 ... done
Attaching to openui_ollama_1, openui_backend_1
ollama_1   | time=2024-04-05T09:16:14.777Z level=INFO source=images.go:804 msg="total blobs: 0"
ollama_1   | time=2024-04-05T09:16:14.782Z level=INFO source=images.go:811 msg="total unused blobs removed: 0"
ollama_1   | time=2024-04-05T09:16:14.782Z level=INFO source=routes.go:1118 msg="Listening on [::]:11434 (version 0.1.30)"
ollama_1   | time=2024-04-05T09:16:14.808Z level=INFO source=payload_common.go:113 msg="Extracting dynamic libraries to /tmp/ollama1007413018/runners ..."
backend_1  | wandb: Unpatching OpenAI completions
backend_1  | INFO (openui):  Starting OpenUI AI Server created by W&B...
backend_1  | INFO (openui):  Running API Server
backend_1  | INFO (uvicorn.error):  Started server process [1]
backend_1  | INFO (uvicorn.error):  Waiting for application startup.
backend_1  | DEBUG (openui):  Starting up server in 1...
backend_1  | INFO (uvicorn.error):  Application startup complete.
backend_1  | INFO (uvicorn.error):  Uvicorn running on http://0.0.0.0:7878 (Press CTRL+C to quit)
ollama_1   | time=2024-04-05T09:16:37.175Z level=INFO source=payload_common.go:140 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 cuda_v11 cpu rocm_v60000]"
ollama_1   | time=2024-04-05T09:16:37.182Z level=INFO source=gpu.go:115 msg="Detecting GPU type"
ollama_1   | time=2024-04-05T09:16:37.184Z level=INFO source=gpu.go:265 msg="Searching for GPU management library libcudart.so*"
ollama_1   | time=2024-04-05T09:16:37.197Z level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/tmp/ollama1007413018/runners/cuda_v11/libcudart.so.11.0]"
ollama_1   | time=2024-04-05T09:16:37.224Z level=INFO source=gpu.go:340 msg="Unable to load cudart CUDA management library /tmp/ollama1007413018/runners/cuda_v11/libcudart.so.11.0: cudart init failure: 35"
ollama_1   | time=2024-04-05T09:16:37.224Z level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia-ml.so"
ollama_1   | time=2024-04-05T09:16:37.226Z level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
ollama_1   | time=2024-04-05T09:16:37.227Z level=INFO source=cpu_common.go:15 msg="CPU has AVX"
ollama_1   | time=2024-04-05T09:16:37.229Z level=INFO source=routes.go:1141 msg="no GPU detected"
backend_1  | INFO (uvicorn.access):  172.20.0.1:44790 - "GET / HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44800 - "GET /assets/vendor-BGjp6CLF.js HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44810 - "GET /assets/index-B8XGlb2P.js HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44822 - "GET /assets/index-1-JB3fie.css HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44826 - "GET /assets/textarea-BDbSZPOd.js HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44842 - "GET /assets/index-CfAbl2kf.js HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44858 - "GET /favicon.png HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44866 - "GET /apple-touch-icon.png HTTP/1.1" 200
backend_1  | INFO (uvicorn.access):  172.20.0.1:44870 - "GET /v1/session HTTP/1.1" 200
ollama_1   | [GIN] 2024/04/05 - 09:16:42 | 200 |   15.083232ms |      172.20.0.3 | GET      "/api/tags"
backend_1  | INFO (uvicorn.access):  172.20.0.1:44878 - "GET /v1/ollama/tags HTTP/1.1" 200

Pointing the browser at https://myserver.com still brings up 127.0.0.1 =( GET | http://127.0.0.1:7878/openui/index.html?buster=113

P.S. The URL seems hardcoded in backend/openui/dist/assets/Builder-BLsf4mqC.js, line 27.

vanpelt commented 3 months ago

Sorry about that, I see what's happening. The problem is a hardcoded line in the codebase. You'll need to build the frontend in hosted mode before building the container. You can do that by running:

cd frontend
npm install
npm run build -- --mode hosted

That will generate new assets that are automatically copied to backend/openui/dist. If you rebuild the container, the iframe will point to the GitHub-hosted page instead of localhost (it needs to be on a separate domain for security reasons).
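Putting the steps together, the full rebuild might look like this (a sketch; the Dockerfile location and image tag are assumptions, so adjust to your checkout's layout):

```shell
# Build the frontend with the hosted-mode config
cd frontend
npm install
npm run build -- --mode hosted
cd ..

# Rebuild the image so it picks up the new assets in backend/openui/dist,
# then recreate the running containers
docker build -t wandb/openui:latest .
docker-compose up -d --force-recreate
```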

phoedos commented 3 months ago

Now it's working. I think this topic (dedicated service configuration + a specific frontend build for hosted mode) should also be mentioned in the README =)