open-telemetry / opentelemetry-python-contrib

OpenTelemetry instrumentation for Python modules
https://opentelemetry.io
Apache License 2.0

Not able to auto-instrument Python application using gunicorn server #2086

Open xwgao opened 11 months ago

xwgao commented 11 months ago

**Describe your environment**
Describe any aspect of your environment relevant to the problem, including your Python version, platform, version numbers of installed dependencies, information about your cloud hosting provider, etc. If you're reporting a problem with a specific version of a library in this repo, please check whether the problem has been fixed on main.

Environment:

**Steps to reproduce**
Describe exactly how to reproduce the error. Include a code sample if applicable.

  1. I installed Community OpenTelemetry Operator v0.89.0 in my OpenShift 4.12.22 cluster.
  2. I created an OpenTelemetry instrumentation as below.
    apiVersion: opentelemetry.io/v1alpha1
    kind: Instrumentation
    metadata:
      annotations:
        instrumentation.opentelemetry.io/default-auto-instrumentation-apache-httpd-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-apache-httpd:1.0.3
        instrumentation.opentelemetry.io/default-auto-instrumentation-dotnet-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-dotnet:1.1.0
        instrumentation.opentelemetry.io/default-auto-instrumentation-go-image: ghcr.io/open-telemetry/opentelemetry-go-instrumentation/autoinstrumentation-go:v0.8.0-alpha
        instrumentation.opentelemetry.io/default-auto-instrumentation-java-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-java:1.31.0
        instrumentation.opentelemetry.io/default-auto-instrumentation-nginx-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-apache-httpd:1.0.3
        instrumentation.opentelemetry.io/default-auto-instrumentation-nodejs-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-nodejs:0.44.0
        instrumentation.opentelemetry.io/default-auto-instrumentation-python-image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-python:0.41b0
      labels:
        app.kubernetes.io/managed-by: opentelemetry-operator
      name: instrumentation
      namespace: my-namespace
    spec:
      apacheHttpd:
        configPath: /usr/local/apache2/conf
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-apache-httpd:1.0.3
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 128Mi
          requests:
            cpu: 1m
            memory: 128Mi
        version: "2.4"
      dotnet:
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-dotnet:1.1.0
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 128Mi
          requests:
            cpu: 50m
            memory: 128Mi
      exporter:
        endpoint: http://otel-collector-headless:4317
      go:
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-dotnet:1.1.0
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 32Mi
          requests:
            cpu: 50m
            memory: 32Mi
      java:
        env:
          - name: OTEL_INSTRUMENTATION_LIBERTY_ENABLED
            value: "true"
          - name: OTEL_METRICS_EXPORTER
            value: none
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-java:1.31.0
        resources:
          limits:
            cpu: 500m
            memory: 64Mi
          requests:
            cpu: 50m
            memory: 64Mi
      nginx:
        configFile: /etc/nginx/nginx.conf
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-apache-httpd:1.0.3
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 128Mi
          requests:
            cpu: 1m
            memory: 128Mi
      nodejs:
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-nodejs:0.44.0
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 128Mi
          requests:
            cpu: 50m
            memory: 128Mi
      propagators:
        - tracecontext
        - baggage
        - b3
      python:
        env:
          - name: OTEL_EXPORTER_OTLP_ENDPOINT
            value: http://otel-collector-headless:4318
          - name: OTEL_PYTHON_DISABLED_INSTRUMENTATIONS
            value: sqlite3
          - name: OTEL_PYTHON_LOG_CORRELATION
            value: "true"
          - name: OTEL_PYTHON_LOG_LEVEL
            value: debug
          - name: OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED
            value: "true"
        image: ghcr.io/open-telemetry/opentelemetry-operator/autoinstrumentation-python:0.41b0
        resourceRequirements:
          limits:
            cpu: 500m
            memory: 32Mi
          requests:
            cpu: 50m
            memory: 32Mi
      resource: {}
      sampler:
        argument: "1"
        type: parentbased_traceidratio
  3. I created an OpenTelemetry collector as below.

    apiVersion: opentelemetry.io/v1alpha1
    kind: OpenTelemetryCollector
    metadata:
      labels:
        app.kubernetes.io/managed-by: opentelemetry-operator
      name: otel
      namespace: my-namespace
    spec:
      config: |
        receivers:
          otlp:
            protocols:
              grpc:
              http:

        processors:
          batch:
            timeout: 10s
            send_batch_size: 10000
          metricstransform:
            transforms:
              - include: mas-core.duration
                match_type: regexp
                action: update
                operations:
                  - action: update_label
                    label: http.url
                    new_label: url
                  - action: update_label
                    label: http.method
                    new_label: method
                  - action: update_label
                    label: http.status_code
                    new_label: code

        exporters:
          logging:
            verbosity: detailed
          prometheus:
            endpoint: "0.0.0.0:8889"
            send_timestamps: true
            metric_expiration: 1440m

        connectors:
          spanmetrics:
            namespace: mas-core
            histogram:
              unit: s
              explicit:
                buckets: [10ms, 100ms, 200ms, 400ms, 800ms, 1s, 1200ms, 1400ms, 1600ms, 1800ms, 2s, 4s, 6s, 8s, 10s]
            dimensions:
              - name: http.method
              - name: http.status_code
              - name: http.url
              - name: http.route
              - name: http.host

        service:
          pipelines:
            traces:
              receivers: [otlp]
              processors: [batch]
              exporters: [spanmetrics, logging]
            metrics:
              receivers: [spanmetrics]
              processors: [batch, metricstransform]
              exporters: [prometheus, logging]
      image: ghcr.io/open-telemetry/opentelemetry-collector-releases/opentelemetry-collector-contrib:0.89.0
      ingress:
        route: {}
      managementState: managed
      mode: statefulset
      observability:
        metrics: {}
      podDisruptionBudget:
        maxUnavailable: 1
      replicas: 1
      resources: {}
      targetAllocator:
        prometheusCR:
          scrapeInterval: 30s
        resources: {}
      updateStrategy: {}
      upgradeStrategy: automatic
  4. I added the annotation below to the deployment for my Python application, which uses the gunicorn server (see the sketch after these steps for where the annotation goes).
    instrumentation.opentelemetry.io/inject-python: "instrumentation"
  5. As per https://github.com/open-telemetry/opentelemetry-python/issues/2038, auto-instrumentation of a Python app running under the gunicorn server is not currently supported out of the box, so I added the following code to the gunicorn.conf.py of my Python application.
    
    import os

    from opentelemetry import trace
    from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
    from opentelemetry.sdk.resources import Resource
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor

    def post_fork(server, worker):
        from opentelemetry.instrumentation.auto_instrumentation import sitecustomize
        server.log.info("Worker spawned (pid: %s)", worker.pid)
        service_name = os.getenv("OTEL_SERVICE_NAME")
        resource = Resource.create(attributes={
            "service.name": service_name
        })

        trace.set_tracer_provider(TracerProvider(resource=resource))
        otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
        span_processor = BatchSpanProcessor(
            OTLPSpanExporter(endpoint=otlp_endpoint)
        )
        trace.get_tracer_provider().add_span_processor(span_processor)
  6. Then, after the pod container of my deployment started, I still could not get the traces exported to my OTel collector.
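
For reference, a minimal sketch of where the inject-python annotation from step 4 sits in the Deployment's pod template (names such as my-python-app are placeholders, not taken from the actual manifest):

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: my-python-app          # placeholder
      namespace: my-namespace
    spec:
      template:
        metadata:
          annotations:
            # the value is the name of the Instrumentation resource created in step 2
            instrumentation.opentelemetry.io/inject-python: "instrumentation"
        spec:
          containers:
            - name: app            # placeholder
              image: my-python-app:latest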

**What is the expected behavior?**
What did you expect to see?

The traces of my Python application are exported to the OTel collector via auto-instrumentation.

**What is the actual behavior?**
What did you see instead?

The traces of my Python application are not exported to the OTel collector via auto-instrumentation.

**Additional context**
I also found the following error messages in the pod log of my deployment.

[2023-12-08 06:58:08 +0000] [57] [ERROR] Exception in worker process
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/gunicorn/arbiter.py", line 609, in spawn_worker
    worker.init_process()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/workers/geventlet.py", line 143, in init_process
    super().init_process()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/workers/base.py", line 134, in init_process
    self.load_wsgi()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/workers/base.py", line 146, in load_wsgi
    self.wsgi = self.app.wsgi()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/app/base.py", line 67, in wsgi
    self.callable = self.load()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py", line 58, in load
    return self.load_wsgiapp()
  File "/usr/local/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py", line 48, in load_wsgiapp
    return util.import_app(self.app_uri)
  File "/usr/local/lib/python3.9/site-packages/gunicorn/util.py", line 371, in import_app
    mod = importlib.import_module(module)
  File "/usr/local/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/opt/ibm/internalapi/internalapi.py", line 22, in <module>
    from api import usersAPI, workspacesAPI, applicationsAPI, metadataAPI, openidAPI, messageAPI, groupsAPI, datadictionaryAPI, manageworkspaceAPI, bindingsAPI, passwordPolicyAPI, idpsAPI
  File "/opt/ibm/internalapi/api/__init__.py", line 15, in <module>
    from api.metadata import metadataAPI
  File "/opt/ibm/internalapi/api/metadata.py", line 17, in <module>
    from managers import MetadataMgr
  File "/opt/ibm/internalapi/managers/__init__.py", line 15, in <module>
    from managers.datadictionary import DataDictionaryMgr
  File "/opt/ibm/internalapi/managers/datadictionary.py", line 41, in <module>
    k8sClient = k8sUtil.dynClient
  File "/usr/local/lib/python3.9/site-packages/mas/utils/k8s/k8sUtil.py", line 105, in dynClient
    self._dynClient = DynamicClient(k8s_client)
  File "/usr/local/lib/python3.9/site-packages/openshift/dynamic/client.py", line 40, in __init__
    K8sDynamicClient.__init__(self, client, cache_file=cache_file, discoverer=discoverer)
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/client.py", line 84, in __init__
    self.discoverer = discoverer(self, cache_file)
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py", line 228, in __init__
    Discoverer.__init__(self, client, cache_file)
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py", line 54, in __init__
    self.__init_cache()
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/discovery.py", line 70, in __init_cache
    self._load_server_info()
  File "/usr/local/lib/python3.9/site-packages/openshift/dynamic/discovery.py", line 98, in _load_server_info
    'kubernetes': self.client.request('get', '/version', serializer=just_json)
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/client.py", line 55, in inner
    resp = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/kubernetes/dynamic/client.py", line 270, in request
    api_response = self.client.call_api(
  File "/usr/local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 348, in call_api
    return self.__call_api(resource_path, method,
  File "/usr/local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 180, in __call_api
    response_data = self.request(
  File "/usr/local/lib/python3.9/site-packages/kubernetes/client/api_client.py", line 373, in request
    return self.rest_client.GET(url,
  File "/usr/local/lib/python3.9/site-packages/kubernetes/client/rest.py", line 241, in GET
    return self.request("GET", url,
  File "/usr/local/lib/python3.9/site-packages/kubernetes/client/rest.py", line 214, in request
    r = self.pool_manager.request(method, url,
  File "/otel-auto-instrumentation-python/urllib3/_request_methods.py", line 110, in request
    return self.request_encode_url(
  File "/otel-auto-instrumentation-python/urllib3/_request_methods.py", line 143, in request_encode_url
    return self.urlopen(method, url, **extra_kw)
  File "/otel-auto-instrumentation-python/urllib3/poolmanager.py", line 443, in urlopen
    response = conn.urlopen(method, u.request_uri, **kw)
  File "/otel-auto-instrumentation-python/wrapt/wrappers.py", line 669, in __call__
    return self._self_wrapper(self.__wrapped__, self._self_instance,
  File "/otel-auto-instrumentation-python/opentelemetry/instrumentation/urllib3/__init__.py", line 243, in instrumented_urlopen
    response = wrapped(*args, **kwargs)
  File "/otel-auto-instrumentation-python/urllib3/connectionpool.py", line 790, in urlopen
    response = self._make_request(
  File "/otel-auto-instrumentation-python/urllib3/connectionpool.py", line 467, in _make_request
    self._validate_conn(conn)
  File "/otel-auto-instrumentation-python/urllib3/connectionpool.py", line 1092, in _validate_conn
    conn.connect()
  File "/otel-auto-instrumentation-python/urllib3/connection.py", line 642, in connect
    sock_and_verified = _ssl_wrap_socket_and_match_hostname(
  File "/otel-auto-instrumentation-python/urllib3/connection.py", line 735, in _ssl_wrap_socket_and_match_hostname
    context = create_urllib3_context(
  File "/otel-auto-instrumentation-python/urllib3/util/ssl_.py", line 292, in create_urllib3_context
    context.minimum_version = TLSVersion.TLSv1_2
  File "/usr/local/lib/python3.9/ssl.py", line 587, in minimum_version
    super(SSLContext, SSLContext).minimum_version.__set__(self, value)
  File "/usr/local/lib/python3.9/ssl.py", line 587, in minimum_version
    super(SSLContext, SSLContext).minimum_version.__set__(self, value)
  File "/usr/local/lib/python3.9/ssl.py", line 587, in minimum_version
    super(SSLContext, SSLContext).minimum_version.__set__(self, value)
  [Previous line repeated 457 more times]
RecursionError: maximum recursion depth exceeded
[2023-12-08 06:58:08 +0000] [57] [INFO] Worker exiting (pid: 57)
CPU_LIMITS for INTERNALAPI: 2
[2023-12-08 06:58:08 +0000] [62] [ERROR] Exception in worker process
Traceback (most recent call last):
  ... (identical to the traceback above) ...
RecursionError: maximum recursion depth exceeded
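
The traceback passes through the urllib3 instrumentation shipped in the injected /otel-auto-instrumentation-python directory before entering the recursive ssl.py setter. Purely as an illustrative sketch (not a confirmed fix), the OTEL_PYTHON_DISABLED_INSTRUMENTATIONS entry that already exists in the Instrumentation resource above could be used to check whether excluding that instrumentation avoids the crash:

    python:
      env:
        - name: OTEL_PYTHON_DISABLED_INSTRUMENTATIONS
          # comma-separated list of instrumentations to skip; sqlite3 is already excluded above
          value: sqlite3,urllib3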

Badrmoh commented 11 months ago

see https://github.com/open-telemetry/opentelemetry-python-contrib/issues/385#issuecomment-808792269

xwgao commented 11 months ago

@Badrmoh I've already imported sitecustomize; my code is below. But I still got the same result. I also tried removing the other lines from post_fork (keeping only the first line, which imports sitecustomize), and the result was the same.

import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

def post_fork(server, worker):
    from opentelemetry.instrumentation.auto_instrumentation import sitecustomize
    server.log.info("Worker spawned (pid: %s)", worker.pid)
    service_name = os.getenv("OTEL_SERVICE_NAME")
    resource = Resource.create(attributes={
        "service.name": service_name
    })

    trace.set_tracer_provider(TracerProvider(resource=resource))
    otlp_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT")
    span_processor = BatchSpanProcessor(
        OTLPSpanExporter(endpoint=otlp_endpoint)
    )
    trace.get_tracer_provider().add_span_processor(span_processor)
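
One detail worth checking in the snippet above: when an endpoint is passed explicitly to the HTTP OTLPSpanExporter it is used as-is, whereas an endpoint taken from OTEL_EXPORTER_OTLP_ENDPOINT normally has the /v1/traces signal path appended by the SDK. A minimal sketch of the post_fork hook with the path appended explicitly (the http://localhost:4318 fallback is just a placeholder):

import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

def post_fork(server, worker):
    server.log.info("Worker spawned (pid: %s)", worker.pid)

    resource = Resource.create(attributes={
        "service.name": os.getenv("OTEL_SERVICE_NAME", "unknown_service")
    })

    # Base endpoint without a signal path, e.g. http://otel-collector-headless:4318
    base_endpoint = os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT", "http://localhost:4318").rstrip("/")

    tracer_provider = TracerProvider(resource=resource)
    tracer_provider.add_span_processor(
        BatchSpanProcessor(
            # An explicitly passed endpoint is not expanded, so append /v1/traces here
            OTLPSpanExporter(endpoint=f"{base_endpoint}/v1/traces")
        )
    )
    trace.set_tracer_provider(tracer_provider)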

xwgao commented 11 months ago

@Badrmoh Is there any update? Thanks a lot.

Badrmoh commented 11 months ago

Sorry, I am not a maintainer. Hope you solve it.

lzpfmh commented 4 months ago

I found some related configuration instructions:

https://grafana.com/docs/grafana-cloud/monitor-applications/application-observability/setup/instrument/python/
https://grafana.com/docs/grafana-cloud/monitor-applications/application-observability/setup/instrument/python/gunicorn/

The Python code from those docs is below.

import logging
from uuid import uuid4

from opentelemetry import metrics, trace
from opentelemetry.exporter.otlp.proto.grpc._log_exporter import (
    OTLPLogExporter,
)
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import (
    OTLPMetricExporter,
)
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter,
)
# support for logs is currently experimental
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.resources import SERVICE_INSTANCE_ID
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# your gunicorn config here
# bind = "127.0.0.1:8000"

collector_endpoint = "http://localhost:4317"

def post_fork(server, worker):
    server.log.info("Worker spawned (pid: %s)", worker.pid)

    resource = Resource.create(
        attributes={
            # each worker needs a unique service.instance.id to distinguish the created metrics in prometheus
            SERVICE_INSTANCE_ID: str(uuid4()),
            "worker": worker.pid,
        }
    )

    tracer_provider = TracerProvider(resource=resource)
    tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint=collector_endpoint)))
    trace.set_tracer_provider(tracer_provider)

    metrics.set_meter_provider(
        MeterProvider(
            resource=resource,
            metric_readers=[(PeriodicExportingMetricReader(
                OTLPMetricExporter(endpoint=collector_endpoint)
            ))],
        )
    )

    logger_provider = LoggerProvider(resource=resource)
    logger_provider.add_log_record_processor(BatchLogRecordProcessor(OTLPLogExporter(endpoint=collector_endpoint)))
    logging.getLogger().addHandler(LoggingHandler(level=logging.NOTSET, logger_provider=logger_provider))
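
A config file like this is what gunicorn is started with, for example via gunicorn -c gunicorn.conf.py app:app (where app:app stands in for the real WSGI entry point).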

xwgao commented 4 months ago

@lzpfmh Thanks a lot for your comments. I tried your code; the only difference is that I set collector_endpoint to http://otel-collector-headless:4317, which is the address of my OTel collector service. But no traces are exported to the OTel collector. Thanks.
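
A minimal sketch of one way to narrow this down (assuming the same gunicorn.conf.py and the gRPC exporter used above): exporting spans to the worker log with a ConsoleSpanExporter alongside OTLP shows whether spans are being produced at all or are only failing to reach the collector.

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import (
    BatchSpanProcessor,
    ConsoleSpanExporter,
    SimpleSpanProcessor,
)

collector_endpoint = "http://otel-collector-headless:4317"

def post_fork(server, worker):
    server.log.info("Worker spawned (pid: %s)", worker.pid)

    tracer_provider = TracerProvider(
        resource=Resource.create(attributes={"worker": worker.pid})
    )
    # Spans printed to stdout confirm that instrumentation produces data in this worker
    tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
    # Spans sent to the collector over gRPC
    tracer_provider.add_span_processor(
        BatchSpanProcessor(OTLPSpanExporter(endpoint=collector_endpoint))
    )
    trace.set_tracer_provider(tracer_provider)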