Summary of problem

Starting in opentelemetry-api==1.25.0, context propagation between ddtrace and opentelemetry does not work. An exception is logged suggesting that the "ddcontextvars_context" runtime context is not being used:
ERROR:opentelemetry.context:Failed to load context: ddcontextvars_context, fallback to contextvars_context
Traceback (most recent call last):
File "/path/to/opentelemetry/context/__init__.py", line 52, in _load_runtime_context
).load()()
File "/path/to/importlib_metadata/__init__.py", line 208, in load
module = import_module(match.group('module'))
File "/path/to/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/path/to/ddtrace/opentelemetry/_context.py", line 6, in <module>
from opentelemetry.trace import NonRecordingSpan as OtelNonRecordingSpan
File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "/path/to/opentelemetry/trace/__init__.py", line 89, in <module>
from opentelemetry.trace.propagation import (
File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "/path/to/opentelemetry/trace/propagation/__init__.py", line 16, in <module>
from opentelemetry.context import create_key, get_value, set_value
ImportError: cannot import name 'create_key' from partially initialized module 'opentelemetry.context' (most likely due to a circular import) (/path/to/opentelemetry/context/__init__.py)
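(As an aside, not part of the original report: one way to confirm that the fallback actually happened is to inspect which runtime-context implementation opentelemetry loaded. _RUNTIME_CONTEXT is a private attribute of opentelemetry.context, so this is a debugging sketch only.)

# Debugging sketch: check which runtime context OTel loaded (private API).
import os

os.environ["DD_TRACE_OTEL_ENABLED"] = "true"  # must be set before importing ddtrace

import ddtrace  # noqa: F401  # with the flag set, OTel should select "ddcontextvars_context"
from opentelemetry import context

# On an affected install this prints the ContextVarsRuntimeContext fallback
# rather than ddtrace's context-sharing implementation.
print(type(context._RUNTIME_CONTEXT))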
Running the example code from https://ddtrace.readthedocs.io/en/stable/api.html#opentelemetry-api, one can see that the traces no longer share a common trace ID. As a result, the opentelemetry traces are completely independent from those created by ddtrace. The opentelemetry traces are still forwarded to Datadog, however.

Which version of dd-trace-py are you using?

2.9.3, the latest stable version as of writing this issue.

Which version of pip are you using?

24.1.2

How can we reproduce your problem?

You can install ddtrace and opentelemetry-api at these versions (per the versions above: ddtrace==2.9.3 and opentelemetry-api==1.25.0), and then run this script:

import os
import logging

# Must be set before ddtrace is imported!
os.environ["DD_TRACE_OTEL_ENABLED"] = "true"
os.environ["DD_TRACE_DEBUG"] = "true"

# Configure the root logger to output all logs at the DEBUG level
logging.basicConfig(level=logging.DEBUG)

print('1')
from ddtrace.opentelemetry import TracerProvider
print('2')
from opentelemetry.trace import set_tracer_provider
print('3')
set_tracer_provider(TracerProvider())
print('4')

from opentelemetry import trace
import ddtrace

oteltracer = trace.get_tracer(__name__)

with oteltracer.start_as_current_span("otel-span") as parent_span:
    parent_span.set_attribute("otel_key", "otel_val")
    with ddtrace.tracer.trace("ddtrace-span") as child_span:
        child_span.set_tag("dd_key", "dd_val")
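As a quick check (not part of the original script), the inner with block could additionally compare the two trace IDs directly; get_span_context() is the standard OpenTelemetry span API, and trace_id is a public attribute of ddtrace spans:

        # Hypothetical extra line inside the inner `with` block: with working
        # context propagation this prints True; on opentelemetry-api==1.25.0
        # it prints False because the spans belong to two separate traces.
        print(parent_span.get_span_context().trace_id == child_span.trace_id)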
What is the result that you get?
This is the output of that script:
1
DEBUG:ddtrace.internal.module:<class 'ddtrace.internal.module.ModuleWatchdog'> installed
DEBUG:datadog.dogstatsd:Statsd buffering is disabled
DEBUG:datadog.dogstatsd:Statsd periodic buffer flush is disabled
debug mode has been enabled for the ddtrace logger
DEBUG:ddtrace:debug mode has been enabled for the ddtrace logger
git tags from env:
DEBUG:ddtrace.internal.gitmetadata:git tags from env:
git tags:
DEBUG:ddtrace.internal.gitmetadata:git tags:
Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
DEBUG:ddtrace.internal.datadog.profiling.ddup:Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094162958, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094162958, effective_rate=1.0), rules=[])
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[])
Statsd buffering is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd buffering is disabled
Statsd periodic buffer flush is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd periodic buffer flush is disabled
initialized processor EndpointCallCounterProcessor()
DEBUG:ddtrace._trace.processor:initialized processor EndpointCallCounterProcessor()
initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[])
DEBUG:ddtrace._trace.processor:initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[])
initialized trace processor TraceTagsProcessor()
DEBUG:ddtrace._trace.processor:initialized trace processor TraceTagsProcessor()
initialized processor TopLevelSpanProcessor()
DEBUG:ddtrace._trace.processor:initialized processor TopLevelSpanProcessor()
initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
DEBUG:ddtrace._trace.processor:initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
ERROR:opentelemetry.context:Failed to load context: ddcontextvars_context, fallback to contextvars_context
Traceback (most recent call last):
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/context/__init__.py", line 52, in _load_runtime_context
).load()()
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/importlib_metadata/__init__.py", line 208, in load
module = import_module(match.group('module'))
File "/Users/andrea.hutchinson/.local/share/mise/installs/python/3.10.14/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/opentelemetry/_context.py", line 6, in <module>
from opentelemetry.trace import NonRecordingSpan as OtelNonRecordingSpan
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/trace/__init__.py", line 89, in <module>
from opentelemetry.trace.propagation import (
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
self.loader.exec_module(module)
File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/trace/propagation/__init__.py", line 16, in <module>
from opentelemetry.context import create_key, get_value, set_value
ImportError: cannot import name 'create_key' from partially initialized module 'opentelemetry.context' (most likely due to a circular import) (/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/context/__init__.py)
2
3
4
RemoteConfigWorker created with polling interval 5
DEBUG:ddtrace.internal.remoteconfig.worker:RemoteConfigWorker created with polling interval 5
[16052][P: 13199] Register ASM Remote Config Callback
DEBUG:ddtrace.appsec._remoteconfiguration:[16052][P: 13199] Register ASM Remote Config Callback
[PID 16052] Subscriber ASM initialized
DEBUG:ddtrace.internal.remoteconfig._subscribers:[PID 16052] Subscriber ASM initialized
finishing span name='ddtrace-span' id=17972391284560862540 trace_id=136414600359553580485446959405673402872 parent_id=None service=None resource='ddtrace-span' type=None start=1721794322.944511 end=1721794322.9445229 duration=1.2e-05 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'dd_key': 'dd_val', 'language': 'python', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='ddtrace-span' id=17972391284560862540 trace_id=136414600359553580485446959405673402872 parent_id=None service=None resource='ddtrace-span' type=None start=1721794322.944511 end=1721794322.9445229 duration=1.2e-05 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'dd_key': 'dd_val', 'language': 'python', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
finishing span name='internal' id=7141925423790266001 trace_id=136414600359553580476710191863588696629 parent_id=None service=None resource='otel-span' type=None start=1721794322.94447 end=1721794322.974968 duration=0.030498 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='internal' id=7141925423790266001 trace_id=136414600359553580476710191863588696629 parent_id=None service=None resource='otel-span' type=None start=1721794322.94447 end=1721794322.974968 duration=0.030498 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
You can see there are two different trace IDs:
136414600359553580485446959405673402872
136414600359553580476710191863588696629
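Decoding those IDs (a sketch, not part of the original report) shows they share the same upper 64 bits, which match the '_dd.p.tid' tag above and embed the trace start time, while the randomly generated lower 64 bits differ; in other words, these are two unrelated traces that merely started in the same second:

# Sketch: split each 128-bit trace ID from the log into its 64-bit halves.
for tid in (
    136414600359553580485446959405673402872,
    136414600359553580476710191863588696629,
):
    print(f"{tid >> 64:016x} {tid & 0xFFFFFFFFFFFFFFFF:016x}")
# Both upper halves print 66a07f1200000000 (the '_dd.p.tid' value);
# the lower halves differ.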
What is the result that you expected?

If you install a previous version of opentelemetry (anything before opentelemetry-api 1.25.0) and then run the script again, you get the correct behavior: there is only one trace ID, 136414648451048226636539382981326167872, shared by both spans.