open-telemetry / opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io
Apache License 2.0

[exporter/elasticsearchexporter] transaction duration missing in traces received in Elasticsearch using elasticsearchexporter #14538

Closed · ramdaspotale closed this issue 1 year ago

ramdaspotale commented 2 years ago

What happened?

Description

I am using the latest OpenTelemetry Collector with the Elasticsearch exporter to export traces to an Elasticsearch instance. I previously used the otlphttp exporter to export traces to Elastic APM and, from there, to Elasticsearch. Although the Elasticsearch exporter is able to push traces to Elasticsearch, the exported documents do not contain "transaction.duration" (or any trace duration covering the time the operation ran in the application), which was present in the documents received via Elastic APM. Below are the collector config I am using, a sample trace received in Elasticsearch via Elastic APM (otlphttp exporter), and the same trace written directly to Elasticsearch (elasticsearch exporter).

Steps to Reproduce

Run OpenTelemetry Collector 0.60.0 with the configuration given below and send traces to an Elasticsearch instance using the elasticsearch exporter.

Expected Result

I expect traces with transaction duration (or trace duration) data, which is currently missing. The document below, ingested via Elastic APM, can be considered the expected result, since it contains transaction.duration.us.

Trace exported via the otlphttp exporter to Elastic APM and from there to Elasticsearch:

    {
      "_index": ".ds-traces-apm-default-2022.08.29-000001",
      "_id": "vTdLfoMBWrBaGggf-z_9",
      "_version": 1,
      "_score": 0,
      "_source": {
        "agent": { "name": "otlp", "version": "unknown" },
        "data_stream.namespace": "default",
        "data_stream.type": "traces",
        "processor": { "name": "transaction", "event": "transaction" },
        "labels": {
          "bmap_plan_name": "hj-otel3",
          "session": "0x34d0f2f6",
          "datasource": "pwdsn_db0",
          "dbAction": "ExecuteSelect",
          "bmap_plan_id": "63075731126d24faff9b56ca",
          "sqlCmd": "select INDEXDOCINFOID,IMPORT from EC_INDEXDOCINFO where (DOC_GUID not in (select o_docguid from dms_doc)) order by IMPORT"
        },
        "observer": {
          "hostname": "elastic-apm-844b55bb54-8wvrt",
          "id": "d3a9b4fe-2d05-4a1d-9644-ac40f8529a36",
          "ephemeral_id": "8ff0ddec-ee11-4f03-9a1b-3f3c18cf425f",
          "type": "apm-server",
          "version": "8.3.3"
        },
        "trace": { "id": "44b3c2c06e15d444a770b87daab45c0a" },
        "@timestamp": "2022-09-27T09:34:05.445Z",
        "ecs": { "version": "1.12.0" },
        "service": {
          "node": { "name": "7c7ab9aa-341b-4283-ba48-66e20ce65174" },
          "framework": { "name": "PWDI" },
          "name": "PWDI Server",
          "language": { "name": "unknown" }
        },
        "data_stream.dataset": "apm",
        "event": {
          "agent_id_status": "missing",
          "ingested": "2022-09-27T09:34:12Z",
          "outcome": "unknown"
        },
        "transaction": {
          "duration": { "us": 9167 },
          "name": "PwDbOperation",
          "id": "c69db0111713fa42",
          "type": "unknown",
          "sampled": true
        },
        "timestamp": { "us": 1664271245445453 }
      }
    }

Actual Result

This is the trace I receive when exported directly to the Elasticsearch instance via the elasticsearch exporter:

    {
      "_index": "traces.projectwise",
      "_id": "rzdMfoMBWrBaGggfHUra",
      "_version": 1,
      "_score": 0,
      "_source": {
        "@timestamp": "2022-09-27T09:34:05.445453700Z",
        "Attributes.bmap.plan_id": "63075731126d24faff9b56ca",
        "Attributes.bmap.plan_name": "hj-otel3",
        "Attributes.datasource": "pwdsn_db0",
        "Attributes.dbAction": "ExecuteSelect",
        "Attributes.session": "0x34d0f2f6",
        "Attributes.sqlCmd": "select INDEXDOCINFOID,IMPORT from EC_INDEXDOCINFO where (DOC_GUID not in (select o_docguid from dms_doc)) order by IMPORT",
        "EndTimestamp": "2022-09-27T09:34:05.454621600Z",
        "Kind": "SPAN_KIND_SERVER",
        "Link": "[]",
        "Name": "PwDbOperation",
        "Resource.service.instance.id": "7c7ab9aa-341b-4283-ba48-66e20ce65174",
        "Resource.service.name": "PWDI Server",
        "SpanId": "c69db0111713fa42",
        "TraceId": "44b3c2c06e15d444a770b87daab45c0a",
        "TraceStatus": 0
      }
    }
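Note that the duration is still recoverable from the two timestamps in this document: EndTimestamp (2022-09-27T09:34:05.454621600Z) minus @timestamp (2022-09-27T09:34:05.445453700Z) is 9,167,900 ns, i.e. roughly 9167 µs, which lines up with the transaction.duration.us value in the APM document above; it is just not written out as a separate field.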

Collector version

0.60.0

Environment information

Environment

Collector image: docker.io/otel/opentelemetry-collector-contrib:0.60.0
Elasticsearch: 8.3.3
Elastic APM: 8.3.3
Platform: Azure Kubernetes Service

OpenTelemetry Collector configuration

    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: 0.0.0.0:4317
          http:
            endpoint: 0.0.0.0:4318
    processors:
      memory_limiter:
        check_interval: 1s
        limit_percentage: 50
        spike_limit_percentage: 30
      batch:
        timeout: 10s
        send_batch_size: 10000
        send_batch_max_size: 11000
      attributes:
        actions:
          - key: bmap.plan_id
            value: 63075731126d24faff9b56ca
            action: insert
          - key: bmap.plan_name
            value: hj-otel3
            action: insert
    extensions:
      health_check:
      memory_ballast:
        size_in_percentage: 30
      pprof:
        endpoint: :1888
      zpages:
        endpoint: :55679
    exporters:
      logging:
        loglevel: debug
      otlphttp/insecure_no_verify:
        endpoint: https://apm.<REDACTED>.com:8200
        compression: none
        tls:
          insecure: false
          insecure_skip_verify: true
      elasticsearch/trace:
        endpoints: [https://elastic.<REDACTED>.com:9200]
        traces_index: traces.projectwise
        api_key: <REDACTED>
        tls:
          insecure: false
          insecure_skip_verify: true
    service:
      extensions: [pprof, zpages, health_check, memory_ballast]
      pipelines:
        traces:
          receivers: [otlp]
          processors: [memory_limiter,batch,attributes]
          exporters: [logging,otlphttp/insecure_no_verify,elasticsearch/trace]

Log output

No response

Additional context

No response

github-actions[bot] commented 2 years ago

Pinging code owners: @urso @faec @blakerouse. See Adding Labels via Comments if you do not have permissions to add labels yourself.

JaredTan95 commented 2 years ago

@ramdaspotale As far as I know, an OTLP span does not contain a transaction.duration attribute. In your case, Elastic APM received the OTLP trace data and calculated the duration from the span's start and end timestamps.
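
If the duration is needed on the documents written by the elasticsearch exporter, one possible workaround is to derive it on the Elasticsearch side at query time. Below is a minimal sketch of a runtime field, assuming @timestamp and EndTimestamp are mapped as date (or date_nanos) fields in the traces.projectwise index; the field name transaction_duration_us is just an example, not something the exporter provides:

    # sketch: derive span duration in microseconds from @timestamp and EndTimestamp
    # (the runtime field name "transaction_duration_us" is arbitrary)
    PUT traces.projectwise/_mapping
    {
      "runtime": {
        "transaction_duration_us": {
          "type": "long",
          "script": {
            "source": "emit(ChronoUnit.MICROS.between(doc['@timestamp'].value, doc['EndTimestamp'].value))"
          }
        }
      }
    }

With a plain date mapping the stored timestamps only have millisecond precision, so the computed value will be coarser than APM's transaction.duration.us; a date_nanos mapping preserves the full resolution.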

github-actions[bot] commented 1 year ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 1 year ago

This issue has been closed as inactive because it has been stale for 120 days with no activity.