imoneoi / openchat

OpenChat: Advancing Open-source Language Models with Imperfect Data
https://openchat.team
Apache License 2.0

Dockerfile for openchat #86

anneum opened this issue 10 months ago

anneum commented 10 months ago

I wanted to share my experience of successfully building a Dockerfile to deploy the openchat API as a container. The process came with some challenges, especially keeping a stable conda environment during the Docker build. I had to use conda; otherwise I could not get the right versions and dependencies pinned.

I believe my Dockerfile can simplify things for users who want to run the openchat project.

The container runs on a host with CUDA 12.2.

FROM nvcr.io/nvidia/pytorch:23.10-py3
USER root

ENV DEBIAN_FRONTEND=noninteractive
RUN apt-get update \
    && apt-get install -y --no-install-recommends \
        wget \
        curl \
        python3 \
        python3-pip \
        python3-venv \
        python3-dev \
    && rm -rf /var/lib/apt/lists/*

# Install Miniconda
ENV CONDA_DIR /opt/conda
RUN wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda.sh \
    && /bin/bash ~/miniconda.sh -b -p $CONDA_DIR
ENV PATH=$CONDA_DIR/bin:$PATH

# Create and initialize Conda environment
RUN conda create -y --name openchat python=3.11 \
    && echo "source activate openchat" > ~/.bashrc

# Use a login shell so ~/.profile (which sources ~/.bashrc on Ubuntu images)
# activates the openchat environment for all subsequent RUN commands
SHELL ["/bin/bash", "--login", "-c"]

# With the environment activated, pip installs ochat into the openchat env
RUN pip3 install ochat

CMD ["conda", "run", "-n", "openchat", "python", "-m", "ochat.serving.openai_api_server", "--model", "openchat/openchat_3.5"]
# If you want to make it available from outside the container
# CMD ["conda", "run", "-n", "openchat", "python", "-m", "ochat.serving.openai_api_server", "--model", "openchat/openchat_3.5", "--host", "0.0.0.0", "--port", "18888"]
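Once the container is up, the server speaks the OpenAI-compatible chat completions protocol. A minimal client sketch in Python using only the standard library (the host/port match the flags above; the model name in the request body is an assumption, check what name your server actually serves):

```python
import json
from urllib import request

# Assumes the second CMD variant above: host 0.0.0.0, port 18888
API_URL = "http://localhost:18888/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble the JSON body for an OpenAI-compatible chat completion."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def post_chat(url: str, payload: dict) -> dict:
    """POST the payload and decode the JSON response (needs a running server)."""
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example usage (requires the container to be up and reachable):
# reply = post_chat(API_URL, build_chat_request("openchat_3.5", "Hello!"))
# print(reply["choices"][0]["message"]["content"])
```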
storuky commented 10 months ago

Conda inside a Docker container? 🤔 What for?

xiaocode337317439 commented 9 months ago

I think it's better to mount the model from outside the container. How should I start it?

anneum commented 9 months ago

@storuky I could not get CUDA 12 to stay installed without conda, because during the installation of ochat, for whatever reason, CUDA 11 was pulled in.

@xiaocode337317439 In my environment, the container runs in a Kubernetes cluster. Here is the corresponding YAML file, with the model stored on a PVC (PersistentVolumeClaim):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: openchat-api
  namespace: openchat
spec:
  selector:
    matchLabels:
      app: openchat-api
  template:
    metadata:
      labels:
        app: openchat-api
    spec:
      containers:
      - name: openchat-api
        imagePullPolicy: Always
        image: <image path>
        resources:
          limits:
            cpu: "32"
            memory: "250Gi"
            nvidia.com/gpu: 1
          requests:
            cpu: "0.5"
            memory: "10Gi"
        ports:
        - containerPort: 18888
          protocol: TCP
          name: api-port
        volumeMounts:
          # Persist model data
        - mountPath: /root/.cache/huggingface/hub
          name: models

      volumes:
      - name: models
        persistentVolumeClaim:
          claimName: models
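The Deployment above only runs the pod; to reach the API from outside the cluster you also need a Service targeting the container port. A minimal NodePort sketch (names and ports mirror the Deployment; the Service type is an assumption, use LoadBalancer or an Ingress if your cluster supports it):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: openchat-api
  namespace: openchat
spec:
  type: NodePort
  selector:
    app: openchat-api
  ports:
  - port: 18888
    targetPort: api-port   # resolves to containerPort 18888 via the named port
    protocol: TCP
```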