Closed HaroonSaid closed 8 months ago
The config:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
      http:

processors:
  resourcedetection:
    detectors:
      - env
      - ecs
  batch/traces:
    send_batch_size: 50
    send_batch_max_size: 1000
  batch/metrics:
    send_batch_max_size: 200
    send_batch_size: 50
    timeout: 10s
  memory_limiter:
    check_interval: 1s
    limit_mib: 500
    spike_limit_mib: 200

exporters:
  logging:
    loglevel: error
  prometheus:
    endpoint: :8889
  otlp:
    endpoint: "${env:OTLP_HOST}"
    tls:
      ca_file: /etc/ca.crt
      cert_file: /etc/client.pem
      key_file: /etc/client.key
      insecure_skip_verify: true
  prometheusremotewrite:
    endpoint: "${env:PROMETHEUSREMOTEWRITE}"
    resource_to_telemetry_conversion:
      enabled: true

extensions:
  health_check:
  pprof:
    endpoint: :1888
  zpages:
    endpoint: :55679

service:
  extensions: [pprof, zpages, health_check]
  pipelines:
    traces:
      receivers: [otlp]
      processors: [resourcedetection, batch/traces]
      exporters: [otlp]
    metrics:
      receivers: [otlp]
      processors: [resourcedetection, batch/metrics]
      exporters: [prometheusremotewrite]
```
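One detail that stands out in the config above: memory_limiter is defined under processors but is not referenced in either pipeline, so it never runs. A sketch of wiring it in (by convention the memory_limiter processor goes first in each pipeline), assuming the rest of the config is unchanged:

```yaml
service:
  extensions: [pprof, zpages, health_check]
  pipelines:
    traces:
      receivers: [otlp]
      # memory_limiter first, so it can refuse data before other processors buffer it
      processors: [memory_limiter, resourcedetection, batch/traces]
      exporters: [otlp]
    metrics:
      receivers: [otlp]
      processors: [memory_limiter, resourcedetection, batch/metrics]
      exporters: [prometheusremotewrite]
```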
Incorrect config
Component(s)
Unknown
What happened?
Memory leaking
Description
We saw memory leaks in 0.91.0 and upgraded to 0.94.0 and continued to see leakage. The config is shown above.

Collector version

0.94.0
Environment information
Environment
OS: Linux (AWS Linux 2)
Compiler (if manually compiled): (e.g., "go 14.2")
OpenTelemetry Collector configuration
No response
Log output
No response
Additional context
No response
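Since the config enables the pprof extension on :1888, a heap profile would help pinpoint what is growing between collector versions; a sketch, assuming the collector is reachable on localhost:

```shell
# Show the top heap allocations via the collector's pprof extension
# (endpoint :1888, as configured above)
go tool pprof -top 'http://localhost:1888/debug/pprof/heap'

# Or save raw profiles some minutes apart, then diff them
curl -s 'http://localhost:1888/debug/pprof/heap' -o heap1.pprof
sleep 600
curl -s 'http://localhost:1888/debug/pprof/heap' -o heap2.pprof
go tool pprof -top -base heap1.pprof heap2.pprof
```

Attaching the growing profile to the issue would make the leak much easier to triage.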