K0nkere / DL_Dice-detection-project

DnD dice detection with CNN and transfer learning / Project for ML Bookcamp

How to: Docker #11

Open K0nkere opened 1 year ago

K0nkere commented 1 year ago

Excluding files and folders from the Docker build context: create a .dockerignore file in the same directory as the Dockerfile:

# By default, ignore everything
*
# Add exceptions for the directories you actually want in the context
!project-source-code
!project-config-dir
# keep source files
!*.py
!*.sh

# ignore .git and .cache folders
.git
.cache

# ignore all *.class files in all folders, including build root
**/*.class

# ignore all markdown (.md) files except README*.md, but re-ignore README-secret.md
*.md
!README*.md
README-secret.md
K0nkere commented 3 months ago

Setting environment variables when building a single image

Given a variables file env.sh of the form:

export BUCKET=chat-bot

Set the variables:

chmod +x env.sh
. env.sh
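Note that the leading dot in `. env.sh` matters: it sources the script into the current shell, so the exported variables persist, whereas running the script as a child process would not. A quick sketch, using a hypothetical /tmp/env.sh:

```shell
# Hypothetical demo: write a sample env.sh, then compare sourcing it
# with running it as a child process.
printf 'export BUCKET=chat-bot\n' > /tmp/env.sh
chmod +x /tmp/env.sh

. /tmp/env.sh                      # sourcing: exports survive here
echo "BUCKET=$BUCKET"              # prints BUCKET=chat-bot

unset BUCKET
sh /tmp/env.sh                     # child process: exports are lost
echo "BUCKET=${BUCKET:-<unset>}"   # prints BUCKET=<unset>
```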

The Dockerfile must contain lines of this form (one pair per variable):

ARG BUCKET
ENV BUCKET=$BUCKET

Build the image:

docker build -t service-image:latest \
    --build-arg BUCKET=$BUCKET \
    .
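For context, a minimal Dockerfile sketch assuming a Python service (the base image, paths, and service.py are hypothetical). ARG is visible only while the image is being built; the ENV line is what carries the value into the running container:

```
FROM python:3.10-slim

# Build-time argument, supplied with --build-arg BUCKET=...
ARG BUCKET
# Promote it to a runtime environment variable so the application
# can read it inside the container as well
ENV BUCKET=$BUCKET

WORKDIR /app
COPY . .
CMD ["python", "service.py"]
```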
K0nkere commented 3 months ago

Setting env variables when running an already-built (!) Docker image

A .env file must be created first:

docker run -it --rm -p 9696:9696 --env-file=".env" actions-server:latest
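The --env-file format is plain KEY=VALUE lines, one per variable, with no export keyword (unlike env.sh above). A hypothetical .env:

```
BUCKET=chat-bot
MODEL_PATH=/models/latest
```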
K0nkere commented 3 months ago

docker-compose syntax

version: '3'
networks:
  txt2python-network:
volumes:
  models:
  ollama:
  open-webui:

services:
  ollama-service:
    container_name: ollama-service
    build:
      context: .
      dockerfile: Dockerfile
    expose:
      - 11400
    environment:
      - OLLAMA_HOST=0.0.0.0:11400
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu] 
    volumes:
      - "/raid/ml-models:/models"
      - "ollama:/root/.ollama"
    ports:
      - "11400:11400"
    networks:
      - txt2python-network

  webUI:
    container_name: ollama-webUI
    image: ghcr.io/open-webui/open-webui:main
    expose:
      - 8080
    ports:
      - "8080:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama-service:11400
    depends_on:
      - ollama-service
    volumes:
      - open-webui:/app/backend/data
    networks:
      - txt2python-network
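docker compose also reads a .env file placed next to the compose file and interpolates ${VAR} references, so values like the OLLAMA_HOST above don't have to be hard-coded. A sketch; the ${VAR:-default} form supplies a fallback when the variable is unset:

```
services:
  ollama-service:
    environment:
      # taken from .env if set, otherwise the fallback after :-
      - OLLAMA_HOST=${OLLAMA_HOST:-0.0.0.0:11400}
```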