open-telemetry / opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io

OTLP collector keeps crashing after enabling redaction for logs #35316

Open pretystar opened 2 hours ago

pretystar commented 2 hours ago

Component(s)

processor/redaction

What happened?

Description

The OTLP collector keeps crashing after the redaction processor is enabled for logs.

Steps to Reproduce

Enable the redaction processor in the logs pipeline with the config below, then send logs over OTLP (a repro sketch follows the config):

  processors:
    redaction:
      allow_all_keys: true
      summary: debug
  service:
    pipelines:
      logs:
        receivers:
          - otlp
        exporters:
          - debug
          - azuremonitor
        processors: 
          - attributes/log
          - redaction
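
For a self-contained reproduction that leaves out the azuremonitor exporter and the attributes/log processor, a small Go sender built on the collector's pdata packages can push a crafted logs payload straight to the OTLP gRPC receiver. This is only a sketch under assumptions: the receiver is taken to listen on the default localhost:4317, and the payload shape (a second scope holding fewer records than its index) is a guess at the trigger based on the stack trace below, not a confirmed condition.

    // repro_send.go: push a small OTLP logs payload to a local collector.
    // Assumption: the OTLP gRPC receiver listens on localhost:4317 (the default).
    package main

    import (
        "context"
        "log"
        "time"

        "go.opentelemetry.io/collector/pdata/plog"
        "go.opentelemetry.io/collector/pdata/plog/plogotlp"
        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    func main() {
        conn, err := grpc.NewClient("localhost:4317",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        ld := plog.NewLogs()
        rl := ld.ResourceLogs().AppendEmpty()

        // First scope: two log records.
        sl0 := rl.ScopeLogs().AppendEmpty()
        sl0.LogRecords().AppendEmpty().Body().SetStr("record 0-0")
        sl0.LogRecords().AppendEmpty().Body().SetStr("record 0-1")

        // Second scope: a single log record. Any code that indexes this
        // slice with the scope index (1) instead of the record index (0)
        // would panic with "index out of range [1] with length 1".
        sl1 := rl.ScopeLogs().AppendEmpty()
        sl1.LogRecords().AppendEmpty().Body().SetStr("record 1-0")

        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        client := plogotlp.NewGRPCClient(conn)
        if _, err := client.Export(ctx, plogotlp.NewExportRequestFromLogs(ld)); err != nil {
            log.Fatal(err)
        }
    }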

Expected Result

Log records are redacted and exported normally; the collector stays up.

Actual Result

The pod keeps crashing with the log shown under "Log output" below.

Collector version

0.109.0

Environment information

Environment

OS: (e.g., "Ubuntu 20.04")
Compiler (if manually compiled): (e.g., "go 14.2")

OpenTelemetry Collector configuration

No response

Log output

2024-09-20T08:26:45.108Z    info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 6}
panic: runtime error: index out of range [1] with length 1

goroutine 858 [running]:
go.opentelemetry.io/collector/pdata/plog.LogRecordSlice.At(...)
    go.opentelemetry.io/collector/pdata@v1.15.0/plog/generated_logrecordslice.go:56
github.com/open-telemetry/opentelemetry-collector-contrib/processor/redactionprocessor.(*redaction).processResourceLog(0xc00320f620, {0xa1b9438, 0xc002ea3b60}, {0xc008e27440?, 0xc001e6b3e8?})
    github.com/open-telemetry/opentelemetry-collector-contrib/processor/redactionprocessor@v0.109.0/processor.go:110 +0x125
github.com/open-telemetry/opentelemetry-collector-contrib/processor/redactionprocessor.(*redaction).processLogs(0xc00320f620, {0xa1b9438, 0xc002ea3b60}, {0xc0028f47e0?, 0xc001e6b3e8?})
    github.com/open-telemetry/opentelemetry-collector-contrib/processor/redactionprocessor@v0.109.0/processor.go:67 +0x4d
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1({0xa1b9438, 0xc002ea3b60}, {0xc0028f47e0?, 0xc001e6b3e8?})
    go.opentelemetry.io/collector/processor@v0.109.0/processorhelper/logs.go:57 +0x13e
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
    go.opentelemetry.io/collector/consumer@v0.109.0/logs.go:26
go.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1({0xa1b9438, 0xc002ea3b60}, {0xc0028f47e0?, 0xc001e6b3e8?})
    go.opentelemetry.io/collector/processor@v0.109.0/processorhelper/logs.go:67 +0x2a2
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
    go.opentelemetry.io/collector/consumer@v0.109.0/logs.go:26
go.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs(...)
    go.opentelemetry.io/collector/consumer@v0.109.0/logs.go:26
go.opentelemetry.io/collector/internal/fanoutconsumer.(*logsConsumer).ConsumeLogs(0xc0035452c0, {0xa1b9438, 0xc002ea3b60}, {0xc0028f47e0?, 0xc001e6b3e8?})
    go.opentelemetry.io/collector@v0.109.0/internal/fanoutconsumer/logs.go:62 +0x1e7
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/logs.(*Receiver).Export(0xc003552f30, {0xa1b9438, 0xc002ea3ad0}, {0xc0028f47e0?, 0xc001e6b3e8?})
    go.opentelemetry.io/collector/receiver/otlpreceiver@v0.109.0/internal/logs/otlp.go:41 +0xd9
go.opentelemetry.io/collector/pdata/plog/plogotlp.rawLogsServer.Export({{0xa158408?, 0xc003552f30?}}, {0xa1b9438, 0xc002ea3ad0}, 0xc0028f47e0)
    go.opentelemetry.io/collector/pdata@v1.15.0/plog/plogotlp/grpc.go:88 +0xea
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler.func1({0xa1b9438?, 0xc002ea3ad0?}, {0x8e6f280?, 0xc0028f47e0?})
    go.opentelemetry.io/collector/pdata@v1.15.0/internal/data/protogen/collector/logs/v1/logs_service.pb.go:311 +0xcb
go.opentelemetry.io/collector/config/configgrpc.(*ServerConfig).toServerOption.enhanceWithClientInformation.func9({0xa1b9438?, 0xc002ea3a70?}, {0x8e6f280, 0xc0028f47e0}, 0x4111a5?, 0xc0028f47f8)
    go.opentelemetry.io/collector/config/configgrpc@v0.109.0/configgrpc.go:459 +0x46
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler({0x7cb7520, 0xc0035a8ab0}, {0xa1b9438, 0xc002ea3a70}, 0xc0026fde00, 0xc00356afe0)
    go.opentelemetry.io/collector/pdata@v1.15.0/internal/data/protogen/collector/logs/v1/logs_service.pb.go:313 +0x143
google.golang.org/grpc.(*Server).processUnaryRPC(0xc003261200, {0xa1b9438, 0xc002ea3980}, {0xa202580, 0xc000201040}, 0xc001dbf320, 0xc0035bd3b0, 0x105ce5f0, 0x0)
    google.golang.org/grpc@v1.66.0/server.go:1393 +0xe11
google.golang.org/grpc.(*Server).handleStream(0xc003261200, {0xa202580, 0xc000201040}, 0xc001dbf320)
    google.golang.org/grpc@v1.66.0/server.go:1804 +0xe8b
google.golang.org/grpc.(*Server).serveStreams.func2.1()
    google.golang.org/grpc@v1.66.0/server.go:1029 +0x7f
created by google.golang.org/grpc.(*Server).serveStreams.func2 in goroutine 223
    google.golang.org/grpc@v1.66.0/server.go:1040 +0x125
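
Reading the trace: the panic originates in LogRecordSlice.At, called from the redaction processor's processResourceLog (processor.go:110), so a log-record slice is being indexed past its length while the processor walks resource logs → scope logs → log records. A standalone illustration of that failure mode, which is an assumption and not the actual processor source, is reusing the outer scope index on the inner record slice; the snippet below panics with exactly the reported "index out of range [1] with length 1":

    // Illustration only: reusing the scope index j on the inner LogRecords
    // slice panics once a later scope holds fewer records than its position.
    package main

    import "go.opentelemetry.io/collector/pdata/plog"

    func main() {
        ld := plog.NewLogs()
        rl := ld.ResourceLogs().AppendEmpty()

        sl0 := rl.ScopeLogs().AppendEmpty()
        sl0.LogRecords().AppendEmpty()
        sl0.LogRecords().AppendEmpty()

        sl1 := rl.ScopeLogs().AppendEmpty()
        sl1.LogRecords().AppendEmpty() // length 1, but j reaches 1 below

        for j := 0; j < rl.ScopeLogs().Len(); j++ {
            ils := rl.ScopeLogs().At(j)
            for k := 0; k < ils.LogRecords().Len(); k++ {
                _ = ils.LogRecords().At(j) // BUG: should be At(k); panics for j=1
            }
        }
    }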

Additional context

No response

github-actions[bot] commented 2 hours ago

Pinging code owners:

qrli commented 2 hours ago

Possible reason: (screenshot of the redaction processor source; not reproduced here)
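
Assuming the cause is the index mix-up illustrated above, the correct pattern is to index the inner LogRecords slice with its own loop variable. The walkResourceLogs helper below is hypothetical and only sketches the intended iteration; it is not the upstream patch:

    // Sketch of the corrected iteration over a ResourceLogs value:
    // the inner slice is indexed with the record index k, not the scope index j.
    package main

    import "go.opentelemetry.io/collector/pdata/plog"

    // walkResourceLogs is a hypothetical helper for this illustration.
    func walkResourceLogs(rl plog.ResourceLogs, visit func(plog.LogRecord)) {
        for j := 0; j < rl.ScopeLogs().Len(); j++ {
            ils := rl.ScopeLogs().At(j)
            for k := 0; k < ils.LogRecords().Len(); k++ {
                visit(ils.LogRecords().At(k)) // record index k, not scope index j
            }
        }
    }

    func main() {
        ld := plog.NewLogs()
        rl := ld.ResourceLogs().AppendEmpty()
        rl.ScopeLogs().AppendEmpty().LogRecords().AppendEmpty()
        rl.ScopeLogs().AppendEmpty().LogRecords().AppendEmpty()
        walkResourceLogs(rl, func(lr plog.LogRecord) {
            lr.Attributes().PutStr("seen", "true")
        })
    }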