weaviate / Verba

Retrieval Augmented Generation (RAG) chatbot powered by Weaviate
BSD 3-Clause "New" or "Revised" License

Unable to load documents or access ollama model with local docker install #319

Open meltedhead opened 2 weeks ago

meltedhead commented 2 weeks ago

Description

I have installed Verba and Weaviate locally via Docker, and Ollama in a Docker environment as well. Everything is attached to a local network that I have added to the docker compose file, as below:

version: '3.8'

services:
  verba:
    build:
      context: ./
      dockerfile: Dockerfile
    networks:
      - verba_network
    ports:
      - 8000:8000
    environment:
      - WEAVIATE_URL_VERBA=http://weaviate:8080
      - OLLAMA_URL=$OLLAMA_URL # Use the private IP address of your Ubuntu host
      - OLLAMA_MODEL=$OLLAMA_MODEL
      - OLLAMA_EMBED_MODEL=$OLLAMA_EMBED_MODEL
      - GITHUB_TOKEN=$GITHUB_TOKEN
    volumes:
      - ./data:/data/
    depends_on:
      weaviate:
        condition: service_healthy
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8000 || exit 1
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s

  weaviate:
    command:
      - --host
      - 0.0.0.0
      - --port
      - '8080'
      - --scheme
      - http
    image: semitechnologies/weaviate:1.25.10
    networks:
      - verba_network
    ports:
      - 8080:8080
      - 3000:8080
    volumes:
      - weaviate_data:/var/lib/weaviate
    restart: on-failure:0
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8080/v1/.well-known/ready || exit 1
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s
    environment:
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      ENABLE_MODULES: 'e'
      CLUSTER_HOSTNAME: 'node1'

networks:
  verba_network:
    external: true

volumes:
  weaviate_data: {}
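
Note that this compose file declares verba_network as external, so the network must exist before the stack is started. A minimal sketch of the launch sequence, assuming no other setup:

docker network create verba_network
docker compose up -d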

I have created a .env file pointing at the Ollama docker host URL, with the following entries:

WEAVIATE_URL_VERBA=http://weaviate:8080

OLLAMA_MODEL=nemotron-mini
OLLAMA_EMBED_MODEL=nomic-embed-text
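
Before uploading anything, it may help to confirm that the Verba container can actually reach both services. A minimal sketch, assuming the image keeps the wget its healthcheck uses and that OLLAMA_URL is set in the container's environment (Ollama's /api/tags endpoint lists the pulled models):

docker compose exec verba wget -qO- http://weaviate:8080/v1/.well-known/ready
docker compose exec verba sh -c 'wget -qO- "$OLLAMA_URL/api/tags"'

If either command fails, the problem is container-to-container reachability rather than Verba itself.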

I then launch Verba and select the Docker setup. I try uploading documents but nothing is uploaded, and when I select Settings -> Manage Suggestions the app crashes with the error: Application error: a client-side exception has occurred (see the browser console for more information).

I have also tried running Ollama outside of Docker, pointing Verba at a localhost address, and I still get the same issue.

Installation

Installed via pip (pip 24.3.1).

Weaviate Deployment

Configuration

Reader: x, Chunker: x, Embedder: x, Retriever: x, Generator: x

meltedhead commented 2 weeks ago

(Screenshot 2024-11-07 114216) The admin panel just seems to hang like this.

meltedhead commented 2 weeks ago

I am using Ubuntu 22.04.4 LTS in a Google Cloud Workstation. It is definitely a permissions issue preventing Verba from communicating with Weaviate and Ollama, as the console output shows:

Mesh: Logo Material: uf
page-25ed1cb73822cf4e.js:1 Mesh: Box Material: uf
/api/connect:1 Failed to load resource: the server responded with a status of 403 (Forbidden)
page-25ed1cb73822cf4e.js:1 WebSocket connection to 'wss://8000-gd88-us1-wks-prod.cluster-l2nixht5snfwmsxzpqqwcn2cms.cloudworkstations.dev/ws/generate_stream' failed: WebSocket is closed before the connection is established.
page-25ed1cb73822cf4e.js:1 WebSocket connection to 'wss://8000-gd88-us1-wks-prod.cluster-l2nixht5snfwmsxzpqqwcn2cms.cloudworkstations.dev/ws/import_files' failed: WebSocket is closed before the connection is established.
23-b149ce429217dd65.js:1 WebSocket Error: Event
23-b149ce429217dd65.js:1 WebSocket connection died
23-b149ce429217dd65.js:1 Import WebSocket Error: Event
23-b149ce429217dd65.js:1 WebSocket connection died
/api/get_meta:1 Failed to load resource: the server responded with a status of 403 (Forbidden)
page-25ed1cb73822cf4e.js:1 WebSocket connection opened to wss://8000-gd88-us1-wks-prod.cluster-l2nixht5snfwmsxzpqqwcn2cms.cloudworkstations.dev/ws/generate_stream
page-25ed1cb73822cf4e.js:1 Import WebSocket connection opened to wss://8000-gd88-us1-wks-prod.cluster-l2nixht5snfwmsxzpqqwcn2cms.cloudworkstations.dev/ws/import_files
b536a0f1-cb79989225eaf318.js:231 THREE.WebGLRenderer: Context Lost.
page-25ed1cb73822cf4e.js:1 Adding status message: Achievement unlocked: Welcome to Verba! SUCCESS
page-25ed1cb73822cf4e.js:1 Adding status message: Importing all files INFO
/api/get_all_suggestions:1 Failed to load resource: the server responded with a status of 403 (Forbidden)
23-b149ce429217dd65.js:1 TypeError: Cannot read properties of undefined (reading 'map')
    at eZ (page-25ed1cb73822cf4e.js:1:100147)
    at rE (fd9d1056-3c0a5e4377f054b9.js:1:40343)
    at l$ (fd9d1056-3c0a5e4377f054b9.js:1:59318)
    at iZ (fd9d1056-3c0a5e4377f054b9.js:1:117925)
    at ia (fd9d1056-3c0a5e4377f054b9.js:1:95164)
    at fd9d1056-3c0a5e4377f054b9.js:1:94986
    at il (fd9d1056-3c0a5e4377f054b9.js:1:94993)
    at oJ (fd9d1056-3c0a5e4377f054b9.js:1:92349)
    at oZ (fd9d1056-3c0a5e4377f054b9.js:1:91768)
    at MessagePort.T (23-b149ce429217dd65.js:1:84429)
/api/get_all_suggestions:1 Failed to load resource: the server responded with a status of 403 (Forbidden)
page-25ed1cb73822cf4e.js:1 Import WebSocket connection closed cleanly, code=1005, reason=
page-25ed1cb73822cf4e.js:1 WebSocket connection closed cleanly, code=1005, reason=
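
The repeated 403s on /api/connect, /api/get_meta, and /api/get_all_suggestions all go through the Cloud Workstations proxy URL. One way to narrow that down (a hypothetical check, not from the thread; adjust if an endpoint expects a POST) is to replay a failing request from a terminal on the workstation itself, bypassing the proxy:

curl -i http://localhost:8000/api/get_meta

If that returns 200 while the browser gets 403, the block is coming from the proxy in front of port 8000, not from Verba or Weaviate.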
uebmaster commented 1 week ago

The truth is that I've only just gotten it working myself, so I don't know much.

But the problem I had was that Verba couldn't reach Ollama, so what I did was create an external network and connect everything to it. I'll leave you my docker compose files to see if they help you:

docker network create ollama-docker

Docker compose Verba:

services:
  verba:
    build:
      context: ./
      dockerfile: Dockerfile
    ports:
      - 8001:8000
    environment:
      - WEAVIATE_URL_VERBA=http://weaviate:8080
#      - OPENAI_API_KEY=$OPENAI_API_KEY
#      - COHERE_API_KEY=$COHERE_API_KEY
      - OLLAMA_URL=http://yourip:7869 # the default is http://host.docker.internal:11434, which likely works as well
      - OLLAMA_MODEL=llama3.2:3b
      - OLLAMA_EMBED_MODEL=sentence-transformers/all-MiniLM-L6-v2
#      - UNSTRUCTURED_API_KEY=$UNSTRUCTURED_API_KEY
#      - UNSTRUCTURED_API_URL=$UNSTRUCTURED_API_URL
#      - GITHUB_TOKEN=$GITHUB_TOKEN
    networks:
       - ollama-docker
    volumes:
      - ./data:/data/
    depends_on:
      weaviate:
        condition: service_healthy
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8000 || exit 1
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s

  weaviate:
    command:
      - --host
      - 0.0.0.0
      - --port
      - '8080'
      - --scheme
      - http
    image: semitechnologies/weaviate:1.25.10
    ports:
      - 8080:8080
      - 3000:8080
    networks:
      - ollama-docker
    volumes:
      - weaviate_data:/var/lib/weaviate
    restart: on-failure:0
    healthcheck:
      test: wget --no-verbose --tries=3 --spider http://localhost:8080/v1/.well-known/ready || exit 1
      interval: 5s
      timeout: 10s
      retries: 5
      start_period: 10s
    environment:
 #     OPENAI_APIKEY: $OPENAI_API_KEY
 #     COHERE_APIKEY: $COHERE_API_KEY
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'true'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      ENABLE_MODULES: 'e'
      CLUSTER_HOSTNAME: 'node1'

volumes:
  weaviate_data: {}

networks:
  ollama-docker:
    external: true

Docker compose Ollama (with GPU; I used this project: https://github.com/valiantlynx/ollama-docker ):

services:
  app:
    build: .
    ports:
      - 8000:8000
      - 5678:5678
    volumes:
      - .:/code
    command: uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload
    restart: always
    depends_on:
      - ollama
      - ollama-webui
    networks:
      - ollama-docker

  ollama:
    volumes:
      - ./ollama/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama:latest
    ports:
      - 7869:11434
    environment:
      - OLLAMA_KEEP_ALIVE=24h
    networks:
      - ollama-docker
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: ollama-webui
    volumes:
      - ./ollama/ollama-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 8082:8080
    environment: # https://docs.openwebui.com/getting-started/env-configuration#default_models
      - OLLAMA_BASE_URLS=http://host.docker.internal:7869 #comma separated ollama hosts
      - ENV=dev
      - WEBUI_AUTH=True
      - WEBUI_NAME=valiantlynx AI
      - WEBUI_URL=http://localhost:8082
      - WEBUI_SECRET_KEY=t0p-s3cr3t
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    networks:
      - ollama-docker

networks:
  ollama-docker:
    external: true
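
Since the two stacks come from separate compose files, it is worth verifying that their containers actually joined the shared network (a standard docker CLI check, not from the thread):

docker network inspect ollama-docker --format '{{range .Containers}}{{.Name}} {{end}}'

Containers for verba, weaviate, ollama, and ollama-webui should all show up (compose prefixes the project name unless container_name is set).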

I suppose the webui container is not necessary, but it helped me install the llama3.2 model. Take into account that the Ollama port is mapped to 7869; it took me a long time to realize that, and that's why Verba couldn't find Ollama, hahaha.
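
For reference, the model can also be pulled without the webui, and the remapped port checked directly (a sketch assuming the container_name: ollama from the compose above):

docker exec ollama ollama pull llama3.2:3b
curl http://localhost:7869/api/tags

Hitting 7869 on the host rather than Ollama's default 11434 is exactly the remapping that the OLLAMA_URL setting has to match.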

I hope you can make it work.