open-telemetry / opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io
Apache License 2.0

Sentry Exporter Distributed Span Returning Invalid Span ID #5721

Closed tonnyadhi closed 1 year ago

tonnyadhi commented 3 years ago

Describe the bug Probably this is related to #4141, which has been closed. I tried to build and use the latest fix from #4141 by building my own Docker image from the opentelemetry-collector-contrib main branch (dd44390). But Sentry cannot display the entire trace and returns an invalid span ID, like in the picture below

Screenshot from 2021-10-13 09-00-35

Meanwhile, the same trace is displayed correctly in Jaeger, even though it was also sent from the same opentelemetry-collector

Screenshot from 2021-10-13 09-03-32

Steps to reproduce Build and use a Docker image from the main branch (dd44390). Here is my Dockerfile:

FROM golang:1.17 AS build

WORKDIR /src
ADD . /src

RUN make otelcontribcol

FROM alpine:latest as certs
RUN apk --update add ca-certificates

FROM scratch

ARG USER_UID=10001
USER ${USER_UID}

COPY --from=certs /etc/ssl/certs/ca-certificates.crt /etc/ssl/certs/ca-certificates.crt
COPY --from=build /src/bin/otelcontribcol_linux_amd64 /otelcontribcol
ENTRYPOINT ["/otelcontribcol"]
EXPOSE 4317 55680 55679

After that, this Docker image is deployed into GKE with the OpenTelemetry Operator

What did you expect to see? Traces displayed correctly on Sentry

What did you see instead? Traces returned with an invalid parent span ID

What version did you use? Version: main branch (dd44390)

What config did you use? Here is my OpenTelemetry Operator config:

apiVersion: opentelemetry.io/v1alpha1
kind: OpenTelemetryCollector
metadata:
  name: opentelemetry-collector-sentry 
  namespace: opentelemetry-collector
spec:
  mode: deployment
  image: ragnalinux/opentelemetry-collector-contrib:latest
  config: |
    receivers:
      otlp:
        protocols:
          grpc:
          http:
    processors:
      batch:
      memory_limiter:
        ballast_size_mib: 720 
        limit_mib: 1500
        spike_limit_mib: 512
        check_interval: 5s
    extensions:
      health_check: {}
    exporters:
      logging: {}
      sentry:
        dsn: SENTRY_DSN 
      jaeger:
        endpoint: JAEGER_URL
    service:
      extensions: [health_check]
      pipelines:
        traces/1:
          receivers: [otlp]
          processors: [memory_limiter, batch]
          exporters: [sentry, jaeger, logging]

Environment GKE with k8s v1.18

joshendriks commented 2 years ago

I encountered the same issue: my browser sends the following to the otel-collector (no parent span ID because it is the root span):

    "resourceSpans": [
        {
            "resource": {
                "attributes": [
                    {
                        "key": "service.name",
                        "value": {
                            "stringValue": "trace-demo-frontend"
                        }
                    },
                    {
                        "key": "telemetry.sdk.language",
                        "value": {
                            "stringValue": "webjs"
                        }
                    },
                    {
                        "key": "telemetry.sdk.name",
                        "value": {
                            "stringValue": "opentelemetry"
                        }
                    },
                    {
                        "key": "telemetry.sdk.version",
                        "value": {
                            "stringValue": "1.0.1"
                        }
                    }
                ],
                "droppedAttributesCount": 0
            },
            "instrumentationLibrarySpans": [
                {
                    "spans": [
                        {
                            "traceId": "74fd66c526af2a377b1da71c1268524b",
                            "spanId": "d0ff09ee7d4e3432",
                            "name": "fetch-span",
                            "kind": 1,
                            "startTimeUnixNano": 1656396617106100000,
                            "endTimeUnixNano": 1656396617107000000,
                            "attributes": [],
                            "droppedAttributesCount": 0,
                            "events": [],
                            "droppedEventsCount": 0,
                            "status": {
                                "code": 0
                            },
                            "links": [],
                            "droppedLinksCount": 0
                        }
                    ],
                    "instrumentationLibrary": {
                        "name": "example-tracer-web"
                    }
                }
            ]
        }
    ]
}

and the following is sent to Sentry (notice the value of parent_span_id):

{"event_id":"8c17d53c33d14c39a48889ff13045daf","sent_at":"2022-06-28T06:10:22.211419517Z"}
{"type":"transaction","length":645}
{
   "contexts":{
      "trace":{
         "trace_id":"74fd66c526af2a377b1da71c1268524b",
         "span_id":"d0ff09ee7d4e3432",
         "parent_span_id":"0000000000000000",
         "description":"fetch-span"
      }
   },
   "event_id":"8c17d53c33d14c39a48889ff13045daf",
   "sdk":{
      "name":"sentry.opentelemetry",
      "version":"0.0.2"
   },
   "tags":{
      "library_name":"example-tracer-web",
      "library_version":"",
      "service.name":"trace-demo-frontend",
      "span_kind":"SPAN_KIND_INTERNAL",
      "telemetry.sdk.language":"webjs",
      "telemetry.sdk.name":"opentelemetry",
      "telemetry.sdk.version":"1.0.1"
   },
   "transaction":"fetch-span",
   "user":{

   },
   "type":"transaction",
   "start_timestamp":"2022-06-28T06:10:17.1061Z",
   "timestamp":"2022-06-28T06:10:17.107Z"
}
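
The parent_span_id of 0000000000000000 suggests the exporter hex-encodes the parent span ID unconditionally: in the collector's data model span IDs are fixed 8-byte values, a root span's absent parent is just the zero-valued array, and hex-encoding it yields exactly the string Sentry rejects as an invalid span ID. Below is a minimal, stdlib-only Go sketch of that behaviour and of the kind of guard that would avoid it; the type and function names are illustrative, not the exporter's actual code.

package main

import (
	"encoding/hex"
	"fmt"
)

// sentryTraceContext loosely mirrors the "contexts.trace" object in the
// envelope above; it is not the exporter's real type.
type sentryTraceContext struct {
	TraceID      string `json:"trace_id"`
	SpanID       string `json:"span_id"`
	ParentSpanID string `json:"parent_span_id,omitempty"`
	Description  string `json:"description"`
}

// buildTraceContext shows the guard that appears to be missing: only encode
// the parent span ID when the span actually has a parent. A root span's
// parent span ID is simply the zero-valued 8-byte array.
func buildTraceContext(traceID [16]byte, spanID, parentSpanID [8]byte, name string) sentryTraceContext {
	ctx := sentryTraceContext{
		TraceID:     hex.EncodeToString(traceID[:]),
		SpanID:      hex.EncodeToString(spanID[:]),
		Description: name,
	}
	if parentSpanID != ([8]byte{}) {
		ctx.ParentSpanID = hex.EncodeToString(parentSpanID[:])
	}
	return ctx
}

func main() {
	// Encoding the zero value unconditionally produces the rejected ID.
	var noParent [8]byte
	fmt.Println(hex.EncodeToString(noParent[:])) // 0000000000000000

	// IDs taken from the root span in the payloads above.
	traceID := [16]byte{0x74, 0xfd, 0x66, 0xc5, 0x26, 0xaf, 0x2a, 0x37, 0x7b, 0x1d, 0xa7, 0x1c, 0x12, 0x68, 0x52, 0x4b}
	spanID := [8]byte{0xd0, 0xff, 0x09, 0xee, 0x7d, 0x4e, 0x34, 0x32}
	fmt.Printf("%+v\n", buildTraceContext(traceID, spanID, noParent, "fetch-span"))
}

With a guard like this, the transaction for a root span would simply omit parent_span_id instead of pointing at the all-zero span ID.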

otel config:

receivers:
  otlp:
    protocols:
      grpc:
      http:
        cors:
          allowed_origins: ["*"]
          allowed_headers: ["*"]

exporters:       
  sentry:
    dsn: <redacted>
    insecure_skip_verify: true

processors:
  batch:

extensions:
  health_check:
  pprof:
    endpoint: :1888
  zpages:
    endpoint: :55679

service:
  telemetry:
    logs:
      level: "debug"
  extensions: [pprof, zpages, health_check]
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [sentry]

versions:

github-actions[bot] commented 1 year ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

github-actions[bot] commented 1 year ago

Pinging code owners for exporter/sentry: @AbhiPrasad. See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 1 year ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 1 year ago

This issue has been closed as inactive because it has been stale for 120 days with no activity.