DataDog / dd-trace-py

Datadog Python APM Client
https://ddtrace.readthedocs.io/

OpenTelemetry API context propagation is broken starting with opentelemetry-api==1.25.0 #9917

Closed. andreaimprovised closed this issue 1 month ago

andreaimprovised commented 2 months ago

Summary of problem

Starting with opentelemetry-api==1.25.0, context propagation between ddtrace and OpenTelemetry no longer works.

An exception is logged which suggests that the "ddcontextvars_context" runtime context is not being used:

ERROR:opentelemetry.context:Failed to load context: ddcontextvars_context, fallback to contextvars_context
Traceback (most recent call last):
  File "/path/to/opentelemetry/context/__init__.py", line 52, in _load_runtime_context
    ).load()()
  File "/path/to/importlib_metadata/__init__.py", line 208, in load
    module = import_module(match.group('module'))
  File "/path/to/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/path/to/ddtrace/opentelemetry/_context.py", line 6, in <module>
    from opentelemetry.trace import NonRecordingSpan as OtelNonRecordingSpan
  File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "/path/to/opentelemetry/trace/__init__.py", line 89, in <module>
    from opentelemetry.trace.propagation import (
  File "/path/to/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "/path/to/opentelemetry/trace/propagation/__init__.py", line 16, in <module>
    from opentelemetry.context import create_key, get_value, set_value
ImportError: cannot import name 'create_key' from partially initialized module 'opentelemetry.context' (most likely due to a circular import) (/path/to/opentelemetry/context/__init__.py)
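
As a hedged diagnostic, assuming opentelemetry-api discovers its runtime context via the "opentelemetry_context" entry point group (which is where ddtrace registers "ddcontextvars_context"), you can list what is registered and check which implementation was actually loaded; _RUNTIME_CONTEXT is a private opentelemetry-api attribute and may differ across versions:

# Hedged diagnostic sketch: list the runtime context implementations that
# opentelemetry-api can discover, then show which one was actually loaded.
from importlib.metadata import entry_points

# Assumption: "opentelemetry_context" is the entry point group used for
# runtime context discovery; ddtrace registers "ddcontextvars_context" in it.
for ep in entry_points(group="opentelemetry_context"):
    print(ep.name, "->", ep.value)

from opentelemetry import context

# Assumption: _RUNTIME_CONTEXT is the private module-level variable holding
# the loaded implementation; with the error above it falls back to the
# default contextvars-based context instead of ddtrace's.
print(type(context._RUNTIME_CONTEXT))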

Running the example code from https://ddtrace.readthedocs.io/en/stable/api.html#opentelemetry-api, one can see that the traces no longer share a common trace ID. As a result, the OpenTelemetry traces are completely independent of those created by ddtrace.

The OpenTelemetry traces are still forwarded to Datadog, however.
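
To make the failure visible without reading the debug logs, here is a hedged sketch that compares the trace IDs directly; it reuses the same public APIs as the reproduction script below (SpanContext.trace_id on the OTel side and Span.trace_id on the ddtrace side):

# Hedged sketch: check whether the ddtrace span created inside an OTel span
# ends up with the same trace ID. With working propagation they are equal.
import os
os.environ["DD_TRACE_OTEL_ENABLED"] = "true"  # must be set before importing ddtrace

from ddtrace.opentelemetry import TracerProvider
from opentelemetry.trace import set_tracer_provider

set_tracer_provider(TracerProvider())

from opentelemetry import trace
import ddtrace

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("otel-span") as otel_span:
    with ddtrace.tracer.trace("ddtrace-span") as dd_span:
        otel_trace_id = otel_span.get_span_context().trace_id  # int
        dd_trace_id = dd_span.trace_id  # int
        print("shared trace id:", otel_trace_id == dd_trace_id)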

Which version of dd-trace-py are you using?

2.9.3, the latest stable version at the time of writing this issue.

Which version of pip are you using?

24.1.2

Which libraries and their versions are you using?

alembic==1.13.2
annotated-types==0.7.0
anthropic==0.31.2
anyio==4.4.0
asgi-lifespan==2.1.0
asttokens==2.4.1
async-timeout==4.0.3
asyncpg==0.29.0
attrs==23.2.0
backoff==2.2.1
beautifulsoup4==4.12.3
blis==0.7.11
boto3==1.34.146
botocore==1.34.146
build==1.2.1
bytecode==0.15.1
catalogue==2.0.10
cattrs==23.2.3
certifi==2024.7.4
cffi==1.16.0
cfgv==3.4.0
chardet==5.2.0
charset-normalizer==3.3.2
click==8.1.7
cloudpathlib==0.18.1
colorama==0.4.6
confection==0.1.5
confluent-kafka==2.5.0
cryptography==43.0.0
cymem==2.0.8
dataclasses-json==0.6.7
datadog==0.49.1
ddsketch==3.0.1
ddtrace==2.9.3
decorator==5.1.1
deepdiff==7.0.1
Deprecated==1.2.14
distlib==0.3.8
distro==1.9.0
dnspython==2.6.1
docker==7.1.0
email_validator==2.2.0
emoji==2.12.1
en-core-web-sm @ https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.7.1/en_core_web_sm-3.7.1-py3-none-any.whl#sha256=86cc141f63942d4b2c5fcee06630fd6f904788d2f0ab005cce45aadb8fb73889
envier==0.5.2
exceptiongroup==1.2.2
executing==2.0.1
fast-depends==2.4.6
fastapi==0.111.1
fastapi-cli==0.0.4
faststream==0.5.15
filelock==3.13.1
filetype==1.2.0
freezegun==1.5.1
fsspec==2024.2.0
furl==2.1.3
gitdb==4.0.11
GitPython==3.1.43
greenlet==3.0.3
gunicorn==22.0.0
h11==0.14.0
httpcore==1.0.5
httptools==0.6.1
httpx==0.27.0
huggingface-hub==0.24.0
identify==2.6.0
idna==3.7
importlib-metadata==6.0.1
iniconfig==2.0.0
internal_libs==0.1.0
ipdb==0.13.13
ipython==8.26.0
jedi==0.19.1
Jinja2==3.1.3
jiter==0.5.0
jmespath==1.0.1
joblib==1.4.2
jsonpatch==1.33
jsonpath-python==1.0.6
jsonpointer==3.0.0
langchain-core==0.2.22
langchain-text-splitters==0.2.2
langcodes==3.4.0
langdetect==1.0.9
langsmith==0.1.93
language_data==1.2.0
layoutparser==0.3.4
loguru==0.7.2
lxml==5.2.2
Mako==1.3.5
marisa-trie==1.2.0
markdown-it-py==3.0.0
MarkupSafe==2.1.5
marshmallow==3.21.3
matplotlib-inline==0.1.7
mdurl==0.1.2
mpmath==1.3.0
murmurhash==1.0.10
mypy-extensions==1.0.0
nest-asyncio==1.6.0
networkx==3.2.1
nltk==3.8.1
nodeenv==1.9.1
numpy==1.26.3
opencv-python==4.10.0.84
opentelemetry-api==1.25.0
opentelemetry-sdk==1.25.0
opentelemetry-semantic-conventions==0.46b0
ordered-set==4.1.0
orderedmultidict==1.0.1
orjson==3.10.6
packaging==24.1
parso==0.8.4
pdf2image==1.17.0
pdfminer.six==20240706
pexpect==4.9.0
pgvector==0.3.2
pikepdf==9.0.0
pillow==10.2.0
pillow_heif==0.17.0
pip-tools==7.4.1
platformdirs==4.2.2
pluggy==1.5.0
pre-commit==3.7.1
preshed==3.0.9
prompt_toolkit==3.0.47
protobuf==5.27.2
psycopg2-binary==2.9.9
ptyprocess==0.7.0
pure_eval==0.2.3
pycparser==2.22
pydantic==2.8.2
pydantic_core==2.20.1
Pygments==2.18.0
pypdf==4.3.1
pypdfium2==4.30.0
pyproject_hooks==1.1.0
pytest==8.3.1
pytest-asyncio==0.23.8
pytest-mock==3.14.0
python-dateutil==2.9.0.post0
python-dotenv==1.0.1
python-iso639==2024.4.27
python-json-logger==2.0.7
python-magic==0.4.27
python-multipart==0.0.9
pytz==2024.1
PyYAML==6.0.1
rapidfuzz==3.9.4
regex==2024.5.15
requests==2.32.3
requests-toolbelt==1.0.0
rich==13.7.1
rollbar==1.0.0
s3transfer==0.10.2
safetensors==0.4.3
shellingham==1.5.4
six==1.16.0
smart-open==7.0.4
smmap==5.0.1
sniffio==1.3.1
soupsieve==2.5
spacy==3.7.5
spacy-legacy==3.0.12
spacy-loggers==1.0.5
SQLAlchemy==2.0.31
srsly==2.4.8
stack-data==0.6.3
starlette==0.37.2
sympy==1.12
tabulate==0.9.0
tenacity==8.5.0
testcontainers==4.7.2
thinc==8.2.5
tokenizers==0.19.1
tomli==2.0.1
torch==2.3.1
torchaudio==2.3.1
torchvision==0.18.1
tqdm==4.66.4
traitlets==5.14.3
transformers==4.42.4
typer==0.12.3
typing-inspect==0.9.0
typing_extensions==4.9.0
Unidecode==1.3.8
unstructured==0.14.8
unstructured-client==0.24.1
urllib3==2.2.2
uvicorn==0.30.3
uvloop==0.19.0
virtualenv==20.26.3
wasabi==1.1.3
watchfiles==0.22.0
wcwidth==0.2.13
weasel==0.4.1
websockets==12.0
wrapt==1.16.0
xmltodict==0.13.0
zipp==3.19.2

How can we reproduce your problem?

Install ddtrace and the OpenTelemetry packages at these versions:

pip install ddtrace==2.9.3 opentelemetry-api==1.25.0 opentelemetry-sdk==1.25.0

and then run this script:

import os
import logging

# Must be set before ddtrace is imported!
os.environ["DD_TRACE_OTEL_ENABLED"] = "true"
os.environ["DD_TRACE_DEBUG"] = "true"

# Configure the root logger to output all logs at the DEBUG level
logging.basicConfig(level=logging.DEBUG)

print('1')
from ddtrace.opentelemetry import TracerProvider
print('2')
from opentelemetry.trace import set_tracer_provider
print('3')
set_tracer_provider(TracerProvider())
print('4')

from opentelemetry import trace
import ddtrace

oteltracer = trace.get_tracer(__name__)

with oteltracer.start_as_current_span("otel-span") as parent_span:
    parent_span.set_attribute("otel_key", "otel_val")
    with ddtrace.tracer.trace("ddtrace-span") as child_span:
        child_span.set_tag("dd_key", "dd_val")

What is the result that you get?

This is the output of that script:

1
DEBUG:ddtrace.internal.module:<class 'ddtrace.internal.module.ModuleWatchdog'> installed
DEBUG:datadog.dogstatsd:Statsd buffering is disabled
DEBUG:datadog.dogstatsd:Statsd periodic buffer flush is disabled
debug mode has been enabled for the ddtrace logger
DEBUG:ddtrace:debug mode has been enabled for the ddtrace logger
git tags from env:
DEBUG:ddtrace.internal.gitmetadata:git tags from env:
git tags:
DEBUG:ddtrace.internal.gitmetadata:git tags:
Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
DEBUG:ddtrace.internal.datadog.profiling.ddup:Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094162958, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094162958, effective_rate=1.0), rules=[])
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[])
Statsd buffering is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd buffering is disabled
Statsd periodic buffer flush is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd periodic buffer flush is disabled
initialized processor EndpointCallCounterProcessor()
DEBUG:ddtrace._trace.processor:initialized processor EndpointCallCounterProcessor()
initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[])
DEBUG:ddtrace._trace.processor:initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[])
initialized trace processor TraceTagsProcessor()
DEBUG:ddtrace._trace.processor:initialized trace processor TraceTagsProcessor()
initialized processor TopLevelSpanProcessor()
DEBUG:ddtrace._trace.processor:initialized processor TopLevelSpanProcessor()
initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
DEBUG:ddtrace._trace.processor:initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=114394094219375, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
ERROR:opentelemetry.context:Failed to load context: ddcontextvars_context, fallback to contextvars_context
Traceback (most recent call last):
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/context/__init__.py", line 52, in _load_runtime_context
    ).load()()
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/importlib_metadata/__init__.py", line 208, in load
    module = import_module(match.group('module'))
  File "/Users/andrea.hutchinson/.local/share/mise/installs/python/3.10.14/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/opentelemetry/_context.py", line 6, in <module>
    from opentelemetry.trace import NonRecordingSpan as OtelNonRecordingSpan
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/trace/__init__.py", line 89, in <module>
    from opentelemetry.trace.propagation import (
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/ddtrace/internal/module.py", line 220, in _exec_module
    self.loader.exec_module(module)
  File "/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/trace/propagation/__init__.py", line 16, in <module>
    from opentelemetry.context import create_key, get_value, set_value
ImportError: cannot import name 'create_key' from partially initialized module 'opentelemetry.context' (most likely due to a circular import) (/Users/andrea.hutchinson/workspace/affinity/python/projects/semantic_search/.venv.darwin.arm64.python310/lib/python3.10/site-packages/opentelemetry/context/__init__.py)
2
3
4
RemoteConfigWorker created with polling interval 5
DEBUG:ddtrace.internal.remoteconfig.worker:RemoteConfigWorker created with polling interval 5
[16052][P: 13199] Register ASM Remote Config Callback
DEBUG:ddtrace.appsec._remoteconfiguration:[16052][P: 13199] Register ASM Remote Config Callback
[PID 16052] Subscriber ASM initialized
DEBUG:ddtrace.internal.remoteconfig._subscribers:[PID 16052] Subscriber ASM initialized
finishing span name='ddtrace-span' id=17972391284560862540 trace_id=136414600359553580485446959405673402872 parent_id=None service=None resource='ddtrace-span' type=None start=1721794322.944511 end=1721794322.9445229 duration=1.2e-05 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'dd_key': 'dd_val', 'language': 'python', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='ddtrace-span' id=17972391284560862540 trace_id=136414600359553580485446959405673402872 parent_id=None service=None resource='ddtrace-span' type=None start=1721794322.944511 end=1721794322.9445229 duration=1.2e-05 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'dd_key': 'dd_val', 'language': 'python', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
finishing span name='internal' id=7141925423790266001 trace_id=136414600359553580476710191863588696629 parent_id=None service=None resource='otel-span' type=None start=1721794322.94447 end=1721794322.974968 duration=0.030498 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='internal' id=7141925423790266001 trace_id=136414600359553580476710191863588696629 parent_id=None service=None resource='otel-span' type=None start=1721794322.94447 end=1721794322.974968 duration=0.030498 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a07f1200000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '082f76cdaff3472f918174830eba546e'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 16052} (enabled:True)

You can see there are two different trace IDs in the output above: 136414600359553580485446959405673402872 for the ddtrace span and 136414600359553580476710191863588696629 for the OTel span.

What is the result that you expected?

If you instead install the previous versions of the OpenTelemetry packages:

pip install opentelemetry-sdk==1.24.0 opentelemetry-api==1.24.0

and then run the script again, you get the correct behavior:

1
DEBUG:ddtrace.internal.module:<class 'ddtrace.internal.module.ModuleWatchdog'> installed
DEBUG:datadog.dogstatsd:Statsd buffering is disabled
DEBUG:datadog.dogstatsd:Statsd periodic buffer flush is disabled
debug mode has been enabled for the ddtrace logger
DEBUG:ddtrace:debug mode has been enabled for the ddtrace logger
git tags from env:
DEBUG:ddtrace.internal.gitmetadata:git tags from env:
git tags:
DEBUG:ddtrace.internal.gitmetadata:git tags:
Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
DEBUG:ddtrace.internal.datadog.profiling.ddup:Failed to import _ddup: No module named 'ddtrace.internal.datadog.profiling.ddup._ddup'
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078382333, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078382333, effective_rate=1.0), rules=[])
initialized RateSampler, sample 100.0% of traces
DEBUG:ddtrace.sampler:initialized RateSampler, sample 100.0% of traces
initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[])
DEBUG:ddtrace.sampler:initialized DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[])
Statsd buffering is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd buffering is disabled
Statsd periodic buffer flush is disabled
DEBUG:ddtrace.vendor.dogstatsd:Statsd periodic buffer flush is disabled
initialized processor EndpointCallCounterProcessor()
DEBUG:ddtrace._trace.processor:initialized processor EndpointCallCounterProcessor()
initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[]), single_span_rules=[])
DEBUG:ddtrace._trace.processor:initialized trace processor TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[]), single_span_rules=[])
initialized trace processor TraceTagsProcessor()
DEBUG:ddtrace._trace.processor:initialized trace processor TraceTagsProcessor()
initialized processor TopLevelSpanProcessor()
DEBUG:ddtrace._trace.processor:initialized processor TopLevelSpanProcessor()
initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
DEBUG:ddtrace._trace.processor:initialized processor SpanAggregator(_partial_flush_enabled=True, _partial_flush_min_spans=300, _trace_processors=[PeerServiceProcessor(), BaseServiceProcessor(), TraceSamplingProcessor(_compute_stats_enabled=False, sampler=DatadogSampler(agent_rates={}, limiter=RateLimiter(rate_limit=100, tokens=100, last_update_ns=115001078435750, effective_rate=1.0), rules=[]), single_span_rules=[]), TraceTagsProcessor()], _writer=AgentWriter(status=<ServiceStatus.STOPPED: 'stopped'>, _interval=1.0), _span_metrics={'spans_created': defaultdict(<class 'int'>, {}), 'spans_finished': defaultdict(<class 'int'>, {})})
2
3
4
trace 136414648451048226636539382981326167872 has 2 spans, 1 finished
DEBUG:ddtrace._trace.processor:trace 136414648451048226636539382981326167872 has 2 spans, 1 finished
finishing span name='ddtrace-span' id=17477878130697167055 trace_id=136414648451048226636539382981326167872 parent_id=5479540028024531364 service=None resource='ddtrace-span' type=None start=1721794929.884165 end=1721794929.884175 duration=1e-05 error=0 tags={'dd_key': 'dd_val'} metrics={} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='ddtrace-span' id=17477878130697167055 trace_id=136414648451048226636539382981326167872 parent_id=5479540028024531364 service=None resource='ddtrace-span' type=None start=1721794929.884165 end=1721794929.884175 duration=1e-05 error=0 tags={'dd_key': 'dd_val'} metrics={} (enabled:True)
RemoteConfigWorker created with polling interval 5
DEBUG:ddtrace.internal.remoteconfig.worker:RemoteConfigWorker created with polling interval 5
[18243][P: 13199] Register ASM Remote Config Callback
DEBUG:ddtrace.appsec._remoteconfiguration:[18243][P: 13199] Register ASM Remote Config Callback
[PID 18243] Subscriber ASM initialized
DEBUG:ddtrace.internal.remoteconfig._subscribers:[PID 18243] Subscriber ASM initialized
finishing span name='internal' id=5479540028024531364 trace_id=136414648451048226636539382981326167872 parent_id=None service=None resource='otel-span' type=None start=1721794929.884108 end=1721794929.884314 duration=0.000206 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a0817100000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '6179d361e3204f81987855579e9aa2f3'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 18243} (enabled:True)
DEBUG:ddtrace._trace.tracer:finishing span name='internal' id=5479540028024531364 trace_id=136414648451048226636539382981326167872 parent_id=None service=None resource='otel-span' type=None start=1721794929.884108 end=1721794929.884314 duration=0.000206 error=0 tags={'_dd.p.dm': '-0', '_dd.p.tid': '66a0817100000000', 'language': 'python', 'otel_key': 'otel_val', 'runtime-id': '6179d361e3204f81987855579e9aa2f3'} metrics={'_dd.top_level': 1, '_dd.tracer_kr': 1.0, '_sampling_priority_v1': 1, 'process_id': 18243} (enabled:True)

You can see there is only one trace ID: 136414648451048226636539382981326167872
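
Until this is fixed, one possible workaround (untested beyond the versions shown above) is to pin the OpenTelemetry packages below 1.25:

pip install "opentelemetry-api<1.25.0" "opentelemetry-sdk<1.25.0"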

emmettbutler commented 2 months ago

Thanks for the clear reproduction code, @andreaimprovised. We'll look into this.

cc @mabdinur