Closed AlexPerkin closed 3 weeks ago
Here is a working docker-compose.yml:

```yaml
version: '3.9'

services:

  # https://hub.docker.com/r/3x3cut0r/privategpt
  privategpt:
    container_name: privategpt
    image: 3x3cut0r/privategpt
    restart: unless-stopped
    environment:
      LLM_MODE: 'ollama'
      OLLAMA_API_BASE: 'http://ollama:11434'
      OLLAMA_EMBEDDING_API_BASE: 'http://ollama:11434'
      OLLAMA_LLM_MODEL: 'llama3:latest'
    ports:
      - 8080:8080/tcp
    depends_on:
      - ollama

  # https://hub.docker.com/r/ollama/ollama
  ollama:
    container_name: ollama
    image: ollama/ollama
    restart: unless-stopped
    ports:
      - 11434:11434/tcp
```
After running this for the first time, you need to pull the Ollama model inside the container:

```shell
docker exec -it ollama ollama pull llama3:latest
```
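If you want the privategpt container to wait until Ollama is actually responding rather than merely started, a healthcheck can be combined with the long-form `depends_on` syntax. This is a minimal sketch, assuming the `ollama` CLI inside the ollama/ollama image can reach its own server (the intervals are arbitrary and may need tuning):

```yaml
services:
  ollama:
    healthcheck:
      # assumption: `ollama list` succeeds only once the server is up
      test: ["CMD", "ollama", "list"]
      interval: 10s
      timeout: 5s
      retries: 5

  privategpt:
    depends_on:
      ollama:
        condition: service_healthy
```

Note that `condition: service_healthy` requires the long-form `depends_on` mapping, so it replaces the short list form (`- ollama`) shown above.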
Hi, please provide a sample docker-compose.yml file for the privategpt project when using Ollama. Thanks in advance.