AlessandroFazio opened 2 months ago
Pinging code owners:
See Adding Labels via Comments if you do not have permissions to add labels yourself.
After looking in more detail at the W3C Trace Context protocol, I figured out that the version part of the traceparent header value is just one byte, so, assuming another byte is taken by the '-', the Substring function should start reading from the byte at index 2 until 16 bytes are read. I tested with 2 as the starting index instead of 3, but this results in the same error.
My guess is that the traceparent header itself, when set as an attribute by the Envoy access log service in the telemetry payload, is represented with the wrong encoding, and the error propagates downstream in data processing. I didn't find anything about hex encoding conversion in the OTTL util functions.
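For reference, this is the traceparent layout defined by the W3C Trace Context spec, using the IDs from this issue (the trailing trace-flags value 01 is assumed):

```
traceparent = 00-1e2c3bff3f92d79cf8d1140f6af32f69-c784b919dc6de35e-01
              version     = chars 0..1   (2 hex chars, one byte)
              trace-id    = chars 3..34  (32 hex chars)
              parent-id   = chars 36..51 (16 hex chars)
              trace-flags = chars 53..54 (2 hex chars)
```

The version is a single byte, but it is serialized as two hex characters, so after the first '-' the trace ID begins at 0-based index 3; starting at index 2 instead captures the separator.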
We have https://github.com/open-telemetry/opentelemetry-collector-contrib/issues/31929 to track a Hex converter, but I believe all you'll need to do is adjust your Substring calls:
```yaml
processors:
  transform/envoyal:
    error_mode: ignore
    log_statements:
      - context: log
        statements:
          - set(attributes["trace-id"], Substring(attributes["traceparent"], 3, 32))
          - set(attributes["parent-id"], Substring(attributes["traceparent"], 36, 16))
          - set(trace_id.string, attributes["trace-id"])
          - set(span_id.string, attributes["parent-id"])
```
That should capture 1e2c3bff3f92d79cf8d1140f6af32f69 as the trace ID and c784b919dc6de35e as the span ID.
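If it helps with verification, here is a minimal pipeline sketch around that processor, assuming an OTLP receiver and the debug exporter as stand-ins for whatever the real pipeline uses:

```yaml
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  # The debug exporter prints each record, including the rewritten trace_id/span_id.
  debug:
    verbosity: detailed

service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [transform/envoyal]
      exporters: [debug]
```

Since error_mode: ignore logs and skips a failing statement rather than dropping the data, the debug output shows directly whether the Substring offsets line up.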
This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.
Component(s)
processor/transform
What happened?
Description
Hello, I have set up a pipeline with the transform processor to extract the trace ID and span ID from the traceparent HTTP header and set trace_id and span_id in the log context.
The key here is:
Steps to Reproduce
Expected Result
This is the payload I expect:
Actual Result
This is the payload I actually receive:
Hope you can help me with this. Thanks in advance for your time.
Collector version
0.97.0
Environment information
OS: Ubuntu 22.04
OpenTelemetry Collector configuration
Log output
{"level":"error","ts":1712330461.6687455,"caller":"logs/processor.go:54","msg":"failed processing logs","kind":"processor","name":"transform/envoyal","pipeline":"logs","error":"failed to execute statement: set(trace_id.string, attributes[\"trace-id\"]), trace ids must be 32 hex characters","stacktrace":"github.com/open-telemetry/opentelemetry-collector-contrib/processor/transformprocessor/internal/logs.(Processor).ProcessLogs\n\tgithub.com/open-telemetry/opentelemetry-collector-contrib/processor/transformprocessor@v0.97.0/internal/logs/processor.go:54\ngo.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1\n\tgo.opentelemetry.io/collector/processor@v0.97.0/processorhelper/logs.go:48\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/consumer@v0.97.0/logs.go:25\ngo.opentelemetry.io/collector/processor/processorhelper.NewLogsProcessor.func1\n\tgo.opentelemetry.io/collector/processor@v0.97.0/processorhelper/logs.go:56\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/consumer@v0.97.0/logs.go:25\ngo.opentelemetry.io/collector/consumer.ConsumeLogsFunc.ConsumeLogs\n\tgo.opentelemetry.io/collector/consumer@v0.97.0/logs.go:25\ngo.opentelemetry.io/collector/internal/fanoutconsumer.(logsConsumer).ConsumeLogs\n\tgo.opentelemetry.io/collector@v0.97.0/internal/fanoutconsumer/logs.go:62\ngo.opentelemetry.io/collector/receiver/otlpreceiver/internal/logs.(Receiver).Export\n\tgo.opentelemetry.io/collector/receiver/otlpreceiver@v0.97.0/internal/logs/otlp.go:41\ngo.opentelemetry.io/collector/pdata/plog/plogotlp.rawLogsServer.Export\n\tgo.opentelemetry.io/collector/pdata@v1.4.0/plog/plogotlp/grpc.go:88\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler.func1\n\tgo.opentelemetry.io/collector/pdata@v1.4.0/internal/data/protogen/collector/logs/v1/logs_service.pb.go:311\ngo.opentelemetry.io/collector/config/configgrpc.(ServerConfig).toServerOption.enhanceWithClientInformation.func9\n\tgo.opentelemetry.io/collector/config/configgrpc@v0.97.0/configgrpc.go:398\ngo.opentelemetry.io/collector/pdata/internal/data/protogen/collector/logs/v1._LogsService_Export_Handler\n\tgo.opentelemetry.io/collector/pdata@v1.4.0/internal/data/protogen/collector/logs/v1/logs_service.pb.go:313\ngoogle.golang.org/grpc.(Server).processUnaryRPC\n\tgoogle.golang.org/grpc@v1.62.1/server.go:1386\ngoogle.golang.org/grpc.(Server).handleStream\n\tgoogle.golang.org/grpc@v1.62.1/server.go:1797\ngoogle.golang.org/grpc.(*Server).serveStreams.func2.1\n\tgoogle.golang.org/grpc@v1.62.1/server.go:1027"}
Additional context