vipulgote1999 opened 3 months ago
Seems like opentelemetry should just be added to requirements-common.txt, no?
Yes
Have you created a PR for it already?
No, a PR still needs to be created.
Otel usage doc: https://github.com/vllm-project/vllm/blob/main/examples/production_monitoring/Otel.md
According to https://github.com/vllm-project/vllm/pull/4687#pullrequestreview-2085770244, the OpenTelemetry packages are not included in the official Docker image; you should install them manually.
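A minimal sketch of installing the packages manually inside an already-running container (the package names are taken from the Otel.md guide linked above; the container name "my-vllm" is a placeholder):

```shell
# Install the OpenTelemetry packages inside a running vLLM container
# ("my-vllm" is a placeholder for your container name), then restart
# the container so the API server picks them up.
docker exec my-vllm pip install \
    opentelemetry-sdk \
    opentelemetry-api \
    opentelemetry-exporter-otlp \
    opentelemetry-semantic-conventions-ai
docker restart my-vllm
```

Note that packages installed this way do not survive container recreation, so this is only a temporary workaround.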
Your current environment
How would you like to use vllm
Description: When spinning up a new Docker container, it shows an error for a missing package.
Docker run command:
docker run -d --runtime nvidia --gpus all \
  -v ~/Vipul/nltk_data:/home/user/nltk_data \
  -v /home/binishb.ttl/Meta-Llama-3.1-8B-Instruct/:/root/Meta-Llama-3.1-8B-Instruct \
  --env "HUGGING_FACE_HUB_TOKEN=xxxxxxxxxxxxxxxxx" \
  -p 8514:8514 --ipc=host \
  --env "CUDA_VISIBLE_DEVICES=1" \
  --entrypoint "python3" \
  vllm/vllm-openai:v0.5.4 \
  -m vllm.entrypoints.openai.api_server \
  --model /root/Meta-Llama-3.1-8B-Instruct \
  --gpu-memory-utilization 0.9 \
  --port 8514 \
  --max-model-len 64000 \
  --seed 42 \
  --otlp-traces-endpoint "grpc://xxxxxxxxxx:4317" \
  --enable-prefix-caching
Error:
Docker logs
Potential Fix:
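Until the packages ship in the official image (e.g. via requirements-common.txt as suggested above), one persistent workaround is to build a derived image. This is a sketch, not an official fix; the package list is assumed from the Otel.md guide, and the tag should match the version you run:

```shell
# Build a derived image that layers the OpenTelemetry packages
# on top of the official vllm/vllm-openai image.
cat > Dockerfile.otel <<'EOF'
FROM vllm/vllm-openai:v0.5.4
RUN pip install \
    opentelemetry-sdk \
    opentelemetry-api \
    opentelemetry-exporter-otlp \
    opentelemetry-semantic-conventions-ai
EOF
docker build -f Dockerfile.otel -t vllm-openai-otel:v0.5.4 .
```

You can then use `vllm-openai-otel:v0.5.4` in place of `vllm/vllm-openai:v0.5.4` in the `docker run` command above, keeping the `--otlp-traces-endpoint` flag unchanged.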