```
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/usr/lib/python3.12/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.12/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/multiprocessing/engine.py", line 388, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/multiprocessing/engine.py", line 134, in from_engine_args
    engine_config = engine_args.create_engine_config()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 1046, in create_engine_config
    observability_config = ObservabilityConfig(
                           ^^^^^^^^^^^^^^^^^^^^
  File "<string>", line 6, in __init__
  File "/usr/local/lib/python3.12/dist-packages/vllm/config.py", line 1834, in __post_init__
    raise ValueError(
ValueError: OpenTelemetry is not available. Unable to configure 'otlp_traces_endpoint'. Ensure OpenTelemetry packages are installed. Original error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/dist-packages/vllm/tracing.py", line 14, in <module>
    from opentelemetry.context.context import Context
ModuleNotFoundError: No module named 'opentelemetry'
```
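For context, here is a minimal sketch of why the `ValueError` fires. The function names below are hypothetical simplifications of the real checks in `vllm/tracing.py` and `ObservabilityConfig.__post_init__` in `vllm/config.py`: the OpenTelemetry import is only required once an OTLP traces endpoint is actually configured.

```python
import importlib.util


def is_otel_installed() -> bool:
    # True only if the 'opentelemetry' package can be imported
    # (vllm/tracing.py does `from opentelemetry.context.context import Context`
    # behind a try/except and records the failure).
    return importlib.util.find_spec("opentelemetry") is not None


def validate_observability(otlp_traces_endpoint, otel_installed=None):
    # Hypothetical simplified mirror of the ObservabilityConfig check:
    # tracing packages are only required when an endpoint is configured.
    if otel_installed is None:
        otel_installed = is_otel_installed()
    if otlp_traces_endpoint is not None and not otel_installed:
        raise ValueError(
            "OpenTelemetry is not available. Unable to configure "
            f"'otlp_traces_endpoint'={otlp_traces_endpoint!r}. "
            "Ensure OpenTelemetry packages are installed."
        )


# No endpoint configured: no tracing dependency is needed, no error.
validate_observability(None, otel_installed=False)

# Endpoint configured but packages missing: raises ValueError.
try:
    validate_observability("http://otel-collector:4317", otel_installed=False)
except ValueError as exc:
    print("raised:", type(exc).__name__)  # prints "raised: ValueError"
```

So the server starts fine without the packages unless `--otlp-traces-endpoint` is passed, which is why the gap only shows up when tracing is enabled.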
[X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
### Your current environment
Using container image vllm/vllm-openai:v0.6.2
Possibly we should include the OpenTelemetry packages in the container image?
https://github.com/vllm-project/vllm/blob/16b24e7dcd8da5f2ac50f149daa77288fa8c14d7/.buildkite/test-pipeline.yaml#L140
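Until the image ships these packages, a derived image is one possible workaround. The package list below is an assumption taken from vLLM's OpenTelemetry example docs; verify it against the release you run:

```dockerfile
# Hypothetical derived image adding the OpenTelemetry dependencies
# that vllm/tracing.py imports when --otlp-traces-endpoint is set.
FROM vllm/vllm-openai:v0.6.2
RUN pip install \
    opentelemetry-sdk \
    opentelemetry-api \
    opentelemetry-exporter-otlp \
    opentelemetry-semantic-conventions-ai
```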
### How you are installing vllm
Using container image `vllm/vllm-openai:v0.6.2`
Passing arguments
### Before submitting a new issue...