open-telemetry / opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io
Apache License 2.0

Memory leak problem with OpenTelemetry Collector and tailsampling #32551

Open jaysonsantos opened 5 months ago

jaysonsantos commented 5 months ago

Component(s)

processor/tailsampling

What happened?

Description

After a few days, sometimes weeks, the collector's memory usage starts to pile up and it stops receiving spans.

Steps to Reproduce

Keep the collector running long enough for memory usage to start growing

Expected Result

Memory usage should stay roughly constant for the number of traces being received

Actual Result

Memory usage never goes down

Collector version

0.89.0

Environment information

Environment

Docker image ghcr.io/open-telemetry/opentelemetry-collector-releases/opentelemetry-collector-contrib

OpenTelemetry Collector configuration

exporters:
  logging:
    verbosity: basic
  otlp/newrelic:
    compression: gzip
    endpoint: endpoint:4317
    headers:
      api-key: token
extensions:
  health_check: null
  pprof:
    endpoint: 0.0.0.0:1777
  zpages: null
processors:
  batch:
    send_batch_size: 10000
    timeout: 10s
  batch/sampled:
    send_batch_size: 10000
    timeout: 10s
  filter/newrelic_and_otel:
    error_mode: ignore
    traces:
      span:
        - name == "TokenLinkingSubscriber.withNRToken"
  memory_limiter:
    check_interval: 5s
    limit_mib: 3800
    spike_limit_mib: 1000
  resourcedetection/system:
    detectors:
      - env
      - system
    override: false
    timeout: 2s
  tail_sampling:
    decision_wait: 60s
    expected_new_traces_per_sec: 10000
    num_traces: 50000000
    policies:
      - name: always_sample_error
        status_code:
          status_codes:
            - ERROR
        type: status_code
      - and:
          and_sub_policy:
            - name: routes
              string_attribute:
                enabled_regex_matching: true
                key: http.route
                values:
                  - /health
                  - /(actuator|sys)/health
              type: string_attribute
            - name: probabilistic-policy
              probabilistic:
                sampling_percentage: 0.1
              type: probabilistic
        name: health_endpoints
        type: and
      - name: sample_10_percent
        probabilistic:
          sampling_percentage: 10
        type: probabilistic
      - latency:
          threshold_ms: 3000
        name: slow-requests
        type: latency
receivers:
  otlp:
    protocols:
      grpc: null
      http: null
service:
  extensions:
    - zpages
    - health_check
    - pprof
  pipelines:
    logs/1:
      exporters:
        - otlp/newrelic
      processors:
        - resourcedetection/system
        - batch
      receivers:
        - otlp
    metrics/1:
      exporters:
        - otlp/newrelic
        - logging
      processors:
        - resourcedetection/system
        - batch
      receivers:
        - otlp
    traces/1:
      exporters:
        - otlp/newrelic
        - logging
      processors:
        - filter/newrelic_and_otel
        - resourcedetection/system
        - tail_sampling
        - batch/sampled
      receivers:
        - otlp
  telemetry:
    metrics:
      address: 0.0.0.0:8888
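
One thing worth calling out about the config above (my own back-of-the-envelope math, not taken from the processor's source): num_traces: 50000000 asks the processor to track up to 50 million traces, and an OTLP trace ID is 16 bytes, so any structure preallocated with one trace-ID slot per trace needs about 763 MiB on its own. That lines up suspiciously well with the ~763MB flat allocation attributed to newTracesProcessor in the heap profile under "Additional context". A quick sanity check in Go:

package main

import "fmt"

func main() {
    // Assumption: one preallocated 16-byte trace-ID slot per num_traces entry.
    const numTraces = 50_000_000 // num_traces from the tail_sampling config above
    const traceIDSize = 16       // bytes in an OTLP trace ID
    fmt.Printf("%.2f MiB\n", float64(numTraces*traceIDSize)/(1<<20)) // prints 762.94 MiB
}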

Log output

No response

Additional context

Fetching profile over HTTP from http://localhost:1777/debug/pprof/heap
Saved profile in /Users/jayson.reis/pprof/pprof.otelcol-contrib.alloc_objects.alloc_space.inuse_objects.inuse_space.016.pb.gz
File: otelcol-contrib
Type: inuse_space
Time: Apr 2, 2024 at 5:15pm (CEST)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) tree
Showing nodes accounting for 3092.52MB, 97.48% of 3172.59MB total
Dropped 214 nodes (cum <= 15.86MB)
----------------------------------------------------------+-------------
      flat  flat%   sum%        cum   cum%   calls calls% + context
----------------------------------------------------------+-------------
                                          763.45MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.createTracesProcessor
  762.95MB 24.05% 24.05%   763.45MB 24.06%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.newTracesProcessor
----------------------------------------------------------+-------------
                                         2024.57MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).ConsumeTraces
  753.56MB 23.75% 47.80%  2024.57MB 63.81%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).processTraces
                                         1267.50MB 62.61% |   sync.(*Map).LoadOrStore
----------------------------------------------------------+-------------
                                             663MB   100% |   sync.(*Map).LoadOrStore
     663MB 20.90% 68.70%      663MB 20.90%                | sync.(*Map).dirtyLocked
----------------------------------------------------------+-------------
                                         1267.50MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).processTraces
  557.50MB 17.57% 86.27%  1267.50MB 39.95%                | sync.(*Map).LoadOrStore
                                             663MB 52.31% |   sync.(*Map).dirtyLocked
                                              47MB  3.71% |   sync.newEntry (inline)
----------------------------------------------------------+-------------
                                             200MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).samplingPolicyOnTick (inline)
  144.50MB  4.55% 90.83%      200MB  6.30%                | go.opentelemetry.io/collector/pdata/ptrace.NewTraces
                                           55.50MB 27.75% |   go.opentelemetry.io/collector/pdata/ptrace.newTraces (inline)
----------------------------------------------------------+-------------
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/ottlspan.NewTransformContext (inline)
      63MB  1.99% 92.81%       63MB  1.99%                | go.opentelemetry.io/collector/pdata/pcommon.NewMap
----------------------------------------------------------+-------------
                                           55.50MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.NewTraces (inline)
   55.50MB  1.75% 94.56%    55.50MB  1.75%                | go.opentelemetry.io/collector/pdata/ptrace.newTraces
----------------------------------------------------------+-------------
                                              47MB   100% |   sync.(*Map).LoadOrStore (inline)
      47MB  1.48% 96.04%       47MB  1.48%                | sync.newEntry
----------------------------------------------------------+-------------
                                              37MB 82.22% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.init.Upsert.func2
                                               8MB 17.78% |   go.opencensus.io/tag.(*mutator).Mutate
      45MB  1.42% 97.46%       45MB  1.42%                | go.opencensus.io/tag.createMetadatas
----------------------------------------------------------+-------------
                                              46MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).samplingPolicyOnTick
    0.50MB 0.016% 97.48%       46MB  1.45%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).makeDecision
                                              45MB 97.83% |   go.opencensus.io/stats.RecordWithTags
----------------------------------------------------------+-------------
                                          252.99MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/timeutils.(*PolicyTicker).Start.func1 (inline)
         0     0% 97.48%   252.99MB  7.97%                | github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/timeutils.(*PolicyTicker).OnTick
                                          252.99MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).samplingPolicyOnTick
----------------------------------------------------------+-------------
         0     0% 97.48%   252.99MB  7.97%                | github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/timeutils.(*PolicyTicker).Start.func1
                                          252.99MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/timeutils.(*PolicyTicker).OnTick (inline)
----------------------------------------------------------+-------------
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1.1 (inline)
         0     0% 97.48%       63MB  1.99%                | github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/ottlspan.NewTransformContext
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/pcommon.NewMap (inline)
----------------------------------------------------------+-------------
                                              63MB   100% |   go.opentelemetry.io/collector/processor/processorhelper.NewTracesProcessor.func1
         0     0% 97.48%       63MB  1.99%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.ResourceSpansSlice.RemoveIf
----------------------------------------------------------+-------------
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.ResourceSpansSlice.RemoveIf
         0     0% 97.48%       63MB  1.99%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.ScopeSpansSlice.RemoveIf
----------------------------------------------------------+-------------
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.ScopeSpansSlice.RemoveIf
         0     0% 97.48%       63MB  1.99%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.SpanSlice.RemoveIf
----------------------------------------------------------+-------------
                                              63MB   100% |   go.opentelemetry.io/collector/pdata/ptrace.SpanSlice.RemoveIf
         0     0% 97.48%       63MB  1.99%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1.1
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/pkg/ottl/contexts/ottlspan.NewTransformContext (inline)
----------------------------------------------------------+-------------
                                         2024.57MB   100% |   go.opentelemetry.io/collector/processor/processorhelper.NewTracesProcessor.func1
         0     0% 97.48%  2024.57MB 63.81%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).ConsumeTraces
                                         2024.57MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).processTraces
----------------------------------------------------------+-------------
                                          252.99MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/internal/coreinternal/timeutils.(*PolicyTicker).OnTick
         0     0% 97.48%   252.99MB  7.97%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).samplingPolicyOnTick
                                             200MB 79.06% |   go.opentelemetry.io/collector/pdata/ptrace.NewTraces (inline)
                                              46MB 18.18% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).makeDecision
----------------------------------------------------------+-------------
                                          763.45MB   100% |   go.opentelemetry.io/collector/processor.CreateTracesFunc.CreateTracesProcessor
         0     0% 97.48%   763.45MB 24.06%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.createTracesProcessor
                                          763.45MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.newTracesProcessor
----------------------------------------------------------+-------------
                                              37MB   100% |   go.opencensus.io/tag.(*mutator).Mutate
         0     0% 97.48%       37MB  1.17%                | github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.init.Upsert.func2
                                              37MB   100% |   go.opencensus.io/tag.createMetadatas
----------------------------------------------------------+-------------
                                          764.95MB   100% |   main.runInteractive
         0     0% 97.48%   764.95MB 24.11%                | github.com/spf13/cobra.(*Command).Execute
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).ExecuteC
----------------------------------------------------------+-------------
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).Execute
         0     0% 97.48%   764.95MB 24.11%                | github.com/spf13/cobra.(*Command).ExecuteC
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).execute
----------------------------------------------------------+-------------
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).ExecuteC
         0     0% 97.48%   764.95MB 24.11%                | github.com/spf13/cobra.(*Command).execute
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.NewCommand.func1
----------------------------------------------------------+-------------
                                              45MB   100% |   go.opencensus.io/stats.RecordWithTags
         0     0% 97.48%       45MB  1.42%                | go.opencensus.io/stats.RecordWithOptions
                                              45MB   100% |   go.opencensus.io/tag.New
----------------------------------------------------------+-------------
                                              45MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).makeDecision
         0     0% 97.48%       45MB  1.42%                | go.opencensus.io/stats.RecordWithTags
                                              45MB   100% |   go.opencensus.io/stats.RecordWithOptions
----------------------------------------------------------+-------------
                                              45MB   100% |   go.opencensus.io/tag.New
         0     0% 97.48%       45MB  1.42%                | go.opencensus.io/tag.(*mutator).Mutate
                                              37MB 82.22% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.init.Upsert.func2
                                               8MB 17.78% |   go.opencensus.io/tag.createMetadatas
----------------------------------------------------------+-------------
                                              45MB   100% |   go.opencensus.io/stats.RecordWithOptions
         0     0% 97.48%       45MB  1.42%                | go.opencensus.io/tag.New
                                              45MB   100% |   go.opencensus.io/tag.(*mutator).Mutate
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/config/configgrpc.(*GRPCServerSettings).toServerOption.enhanceWithClientInformation.func9
                                         2088.07MB   100% |   go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
                                         2025.07MB 96.98% |   go.opentelemetry.io/collector/processor/processorhelper.NewTracesProcessor.func1
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
                                         2088.07MB   100% |   go.opentelemetry.io/collector/processor/processorhelper.NewTracesProcessor.func1
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
                                         2088.07MB   100% |   go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.NewCommand.func1
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/otelcol.(*Collector).Run
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.(*Collector).setupConfigurationComponents
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.(*Collector).Run
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/otelcol.(*Collector).setupConfigurationComponents
                                          764.95MB   100% |   go.opentelemetry.io/collector/service.New
----------------------------------------------------------+-------------
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).execute
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/otelcol.NewCommand.func1
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.(*Collector).Run
----------------------------------------------------------+-------------
                                         2100.32MB   100% |   google.golang.org/grpc.(*Server).processUnaryRPC
         0     0% 97.48%  2100.32MB 66.20%                | go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler
                                         2088.07MB 99.42% |   go.opentelemetry.io/collector/config/configgrpc.(*GRPCServerSettings).toServerOption.enhanceWithClientInformation.func9
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/config/configgrpc.(*GRPCServerSettings).toServerOption.enhanceWithClientInformation.func9
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1
                                         2088.07MB   100% |   go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export
----------------------------------------------------------+-------------
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces
         0     0% 97.48%       63MB  1.99%                | go.opentelemetry.io/collector/pdata/ptrace.ResourceSpansSlice.RemoveIf
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1
----------------------------------------------------------+-------------
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1
         0     0% 97.48%       63MB  1.99%                | go.opentelemetry.io/collector/pdata/ptrace.ScopeSpansSlice.RemoveIf
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1
----------------------------------------------------------+-------------
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1
         0     0% 97.48%       63MB  1.99%                | go.opentelemetry.io/collector/pdata/ptrace.SpanSlice.RemoveIf
                                              63MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces.func1.1.1
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export
                                         2088.07MB   100% |   go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export
----------------------------------------------------------+-------------
                                          763.45MB   100% |   go.opentelemetry.io/collector/service/internal/graph.(*processorNode).buildComponent
         0     0% 97.48%   763.45MB 24.06%                | go.opentelemetry.io/collector/processor.(*Builder).CreateTraces
                                          763.45MB   100% |   go.opentelemetry.io/collector/processor.CreateTracesFunc.CreateTracesProcessor
----------------------------------------------------------+-------------
                                          763.45MB   100% |   go.opentelemetry.io/collector/processor.(*Builder).CreateTraces
         0     0% 97.48%   763.45MB 24.06%                | go.opentelemetry.io/collector/processor.CreateTracesFunc.CreateTracesProcessor
                                          763.45MB   100% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.createTracesProcessor
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/processor/processorhelper.NewTracesProcessor.func1
                                         2025.07MB 96.98% |   go.opentelemetry.io/collector/consumer.ConsumeTracesFunc.ConsumeTraces
                                         2024.57MB 96.96% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor.(*tailSamplingSpanProcessor).ConsumeTraces
                                              63MB  3.02% |   github.com/open-telemetry/opentelemetry-collector-contrib/processor/filterprocessor.(*filterSpanProcessor).processTraces
----------------------------------------------------------+-------------
                                         2088.07MB   100% |   go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export
         0     0% 97.48%  2088.07MB 65.82%                | go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export
                                         2088.07MB   100% |   go.opentelemetry.io/collector/internal/fanoutconsumer.(*tracesConsumer).ConsumeTraces
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/service.New
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/service.(*Service).initExtensionsAndPipeline
                                          764.95MB   100% |   go.opentelemetry.io/collector/service/internal/graph.Build
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/otelcol.(*Collector).setupConfigurationComponents
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/service.New
                                          764.95MB   100% |   go.opentelemetry.io/collector/service.(*Service).initExtensionsAndPipeline
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/service/internal/graph.Build
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/service/internal/graph.(*Graph).buildComponents
                                          763.45MB 99.80% |   go.opentelemetry.io/collector/service/internal/graph.(*processorNode).buildComponent
----------------------------------------------------------+-------------
                                          763.45MB   100% |   go.opentelemetry.io/collector/service/internal/graph.(*Graph).buildComponents
         0     0% 97.48%   763.45MB 24.06%                | go.opentelemetry.io/collector/service/internal/graph.(*processorNode).buildComponent
                                          763.45MB   100% |   go.opentelemetry.io/collector/processor.(*Builder).CreateTraces
----------------------------------------------------------+-------------
                                          764.95MB   100% |   go.opentelemetry.io/collector/service.(*Service).initExtensionsAndPipeline
         0     0% 97.48%   764.95MB 24.11%                | go.opentelemetry.io/collector/service/internal/graph.Build
                                          764.95MB   100% |   go.opentelemetry.io/collector/service/internal/graph.(*Graph).buildComponents
----------------------------------------------------------+-------------
                                         2102.36MB   100% |   google.golang.org/grpc.(*Server).serveStreams.func1.1
         0     0% 97.48%  2102.36MB 66.27%                | google.golang.org/grpc.(*Server).handleStream
                                         2102.36MB   100% |   google.golang.org/grpc.(*Server).processUnaryRPC
----------------------------------------------------------+-------------
                                         2102.36MB   100% |   google.golang.org/grpc.(*Server).handleStream
         0     0% 97.48%  2102.36MB 66.27%                | google.golang.org/grpc.(*Server).processUnaryRPC
                                         2100.32MB 99.90% |   go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler
----------------------------------------------------------+-------------
         0     0% 97.48%  2102.36MB 66.27%                | google.golang.org/grpc.(*Server).serveStreams.func1.1
                                         2102.36MB   100% |   google.golang.org/grpc.(*Server).handleStream
----------------------------------------------------------+-------------
                                          764.45MB 99.93% |   runtime.main
         0     0% 97.48%   764.95MB 24.11%                | main.main
                                          764.95MB   100% |   main.run (inline)
----------------------------------------------------------+-------------
                                          764.95MB   100% |   main.main (inline)
         0     0% 97.48%   764.95MB 24.11%                | main.run
                                          764.95MB   100% |   main.runInteractive
----------------------------------------------------------+-------------
                                          764.95MB   100% |   main.run
         0     0% 97.48%   764.95MB 24.11%                | main.runInteractive
                                          764.95MB   100% |   github.com/spf13/cobra.(*Command).Execute
----------------------------------------------------------+-------------
                                           36.95MB   100% |   runtime.main (inline)
         0     0% 97.48%    36.95MB  1.16%                | runtime.doInit
                                           36.95MB   100% |   runtime.doInit1
----------------------------------------------------------+-------------
                                           36.95MB   100% |   runtime.doInit
         0     0% 97.48%    36.95MB  1.16%                | runtime.doInit1
----------------------------------------------------------+-------------
         0     0% 97.48%   801.40MB 25.26%                | runtime.main
                                          764.45MB 95.39% |   main.main
                                           36.95MB  4.61% |   runtime.doInit (inline)
----------------------------------------------------------+-------------
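
To get a feel for where the steady-state memory could be going, here is a minimal sketch (my own toy program, not the processor's actual code): the profile attributes most of the retained memory to sync.(*Map).LoadOrStore called from processTraces, so this program just keeps one small entry per trace ID in a sync.Map, mimicking a per-trace index that never gets pruned, and reports how much heap that alone pins.

package main

import (
    "encoding/binary"
    "fmt"
    "runtime"
    "sync"
)

// traceData is a stand-in for whatever per-trace bookkeeping a real processor keeps.
type traceData struct {
    spanCount int
}

func main() {
    var idToTrace sync.Map // hypothetical per-trace index keyed by 16-byte trace IDs

    const numTraces = 1_000_000 // kept small so the toy runs quickly; cf. num_traces: 50000000 above
    for i := 0; i < numTraces; i++ {
        var id [16]byte // OTLP trace IDs are 16 bytes
        binary.BigEndian.PutUint64(id[8:], uint64(i))
        idToTrace.LoadOrStore(id, &traceData{spanCount: 1})
    }

    runtime.GC()
    var m runtime.MemStats
    runtime.ReadMemStats(&m)
    fmt.Printf("heap in use with %d retained entries: %d MiB\n", numTraces, m.HeapInuse>>20)
}

The point is only that entries held in a sync.Map are released when something explicitly deletes them; if the real processor only evicts once the num_traces cap is reached, a cap of 50 million would make memory look like it never goes down even while it is technically bounded.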

This was originally reported in #29762

github-actions[bot] commented 5 months ago

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 3 months ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

jaysonsantos commented 3 months ago

This still happens on current versions.

Jonatthu commented 1 month ago

This is still happening even with providers like Sentry on Node.js.