vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Installation]: Adding opentelemetry packages in container image #9340

Open sanketsudake opened 1 week ago

sanketsudake commented 1 week ago

Your current environment

Using container image vllm/vllm-openai:v0.6.2

Process SpawnProcess-1:
Traceback (most recent call last):
  File "/usr/lib/python3.12/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.12/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/multiprocessing/engine.py", line 388, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/multiprocessing/engine.py", line 134, in from_engine_args
    engine_config = engine_args.create_engine_config()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 1046, in create_engine_config
    observability_config = ObservabilityConfig(
                           ^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 6, in __init__
  File "/usr/local/lib/python3.12/dist-packages/vllm/config.py", line 1834, in __post_init__
    raise ValueError(
ValueError: OpenTelemetry is not available. Unable to configure 'otlp_traces_endpoint'. Ensure OpenTelemetry packages are installed. Original error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/vllm/tracing.py", line 14, in <module>
    from opentelemetry.context.context import Context
ModuleNotFoundError: No module named 'opentelemetry'

Possibly we should include the opentelemetry packages in the container image? The CI test pipeline already installs them:

  - "pip install \
      'opentelemetry-sdk>=1.26.0,<1.27.0' \
      'opentelemetry-api>=1.26.0,<1.27.0' \
      'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
      'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'"

https://github.com/vllm-project/vllm/blob/16b24e7dcd8da5f2ac50f149daa77288fa8c14d7/.buildkite/test-pipeline.yaml#L140
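In the meantime, a minimal sketch of a derived image that layers the missing packages onto the published one. The base tag matches the environment above, the version pins are copied from the test pipeline linked above, and the image name used below is just a placeholder:

    # Derived image: adds the optional OpenTelemetry dependencies that
    # vllm/tracing.py imports when --otlp-traces-endpoint is set.
    FROM vllm/vllm-openai:v0.6.2

    # Same pins as .buildkite/test-pipeline.yaml
    RUN pip install \
        'opentelemetry-sdk>=1.26.0,<1.27.0' \
        'opentelemetry-api>=1.26.0,<1.27.0' \
        'opentelemetry-exporter-otlp>=1.26.0,<1.27.0' \
        'opentelemetry-semantic-conventions-ai>=0.4.1,<0.5.0'

A quick check that the import now succeeds:

    docker build -t vllm-openai-otel .
    docker run --rm --entrypoint python3 vllm-openai-otel -c 'import opentelemetry'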

How you are installing vllm

Using container image vllm/vllm-openai:v0.6.2

Passing the following arguments:

  - args:
    - --model
    - <model>
    - --max-model-len
    - "4192"
    - --port
    - "8000"
    - --otlp-traces-endpoint
    - grpc://jaeger-collector.observability.svc:4317
    - --collect-detailed-traces
    - all
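For reference, a hedged equivalent with plain docker run (the published image's entrypoint is the OpenAI-compatible API server, so these flags are appended to it; <model> stays a placeholder as above):

    # Same flags as the container spec above, run directly with docker.
    docker run --gpus all -p 8000:8000 vllm/vllm-openai:v0.6.2 \
        --model <model> \
        --max-model-len 4192 \
        --port 8000 \
        --otlp-traces-endpoint grpc://jaeger-collector.observability.svc:4317 \
        --collect-detailed-traces all

Without the opentelemetry packages in the image, this fails at startup with the ValueError shown above.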


simon-mo commented 1 week ago

PR welcomed.