open-telemetry / opentelemetry-collector-contrib

Contrib repository for the OpenTelemetry Collector
https://opentelemetry.io
Apache License 2.0

k8sattributes processor attributes not appearing in Azure Monitor Logs #24210

Closed: zparnold closed this 8 months ago

zparnold commented 1 year ago

Component(s)

exporter/azuremonitor

What happened?

Description

When I use the logging exporter, I can see the attributes added by the k8sattributes processor in the collector logs. But when I add the azuremonitor exporter, none of those attributes appear on the exported records. I expect to see them in customDimensions, but they are nowhere to be found.

Steps to Reproduce

Expected Result

To have the K8s pod name and other attributes present in the customDimensions section

Actual Result

[screenshot: Azure Monitor log record]

(There are no k8sattributes to be found anywhere)

Collector version

v0.80.0

Environment information

Environment

OpenTelemetryOperator deployed collector
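
For reference: the spec below uses serviceAccount: collector, and the k8sattributes processor needs that service account to be able to read pod metadata. The enriched attributes in the logging exporter output further down suggest this is already in place, but a minimal sketch of the RBAC typically required (resource names and namespace are assumed here, not taken from this cluster) looks like:

apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: otel-collector-k8sattributes   # assumed name
rules:
  - apiGroups: [""]
    resources: ["pods", "namespaces"]
    verbs: ["get", "watch", "list"]
  - apiGroups: ["apps"]
    resources: ["replicasets"]          # needed to resolve k8s.deployment.name
    verbs: ["get", "watch", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: otel-collector-k8sattributes   # assumed name
subjects:
  - kind: ServiceAccount
    name: collector
    namespace: default
roleRef:
  kind: ClusterRole
  name: otel-collector-k8sattributes
  apiGroup: rbac.authorization.k8s.io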

OpenTelemetry Collector configuration

apiVersion: opentelemetry.io/v1alpha1
kind: OpenTelemetryCollector
metadata:
  name: otel
  namespace: default
spec:
  mode: daemonset
  hostNetwork: true
  serviceAccount: collector
  env:
    - name: KUBE_NODE_NAME
      valueFrom:
        fieldRef:
          apiVersion: v1
          fieldPath: spec.nodeName
  config: |
    receivers:
      otlp:
        protocols:
          grpc:
          http:
    processors:
      batch:
      k8sattributes:
    exporters:
      azuremonitor:
        instrumentation_key: xxxxxxxxxxxxx
      logging:
        verbosity: detailed
    extensions:
      zpages:
      health_check:
    service:
      extensions: [zpages, health_check]
      pipelines:
        traces:
          receivers: [otlp]
          processors: [k8sattributes, batch]
          exporters: [azuremonitor]
        logs:
          receivers: [otlp]
          processors: [k8sattributes, batch]
          exporters: [azuremonitor, logging]
        metrics:
          receivers: [otlp]
          processors: [k8sattributes, batch]
          exporters: [azuremonitor]
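
Note that k8sattributes: above runs entirely on defaults. A more explicit configuration, sketched here using only documented options of the processor, pins the extracted metadata and, since the collector runs as a daemonset, restricts the watch to the local node via the KUBE_NODE_NAME variable already set in the spec:

processors:
  k8sattributes:
    auth_type: serviceAccount
    passthrough: false
    filter:
      # scope the informer to this node; KUBE_NODE_NAME is set in the spec above
      node_from_env_var: KUBE_NODE_NAME
    extract:
      metadata:
        - k8s.pod.name
        - k8s.pod.uid
        - k8s.pod.start_time
        - k8s.namespace.name
        - k8s.node.name
        - k8s.deployment.name

Whether this changes the Azure Monitor output is untested here; it only removes the dependency on default behavior when comparing collectors.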

Log output

otel-collector-bwhnc otc-container 2023-07-11T14:27:32.624Z     info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 6}
otel-collector-bwhnc otc-container 2023-07-11T14:27:32.624Z     info    ResourceLog #0
otel-collector-bwhnc otc-container Resource SchemaURL: https://opentelemetry.io/schemas/1.20.0
otel-collector-bwhnc otc-container Resource attributes:
otel-collector-bwhnc otc-container      -> container.id: Str(fda1b2f2263d79ceb923e17da9db2a22b1af5bccc413edd1b99804ce085041cb)
otel-collector-bwhnc otc-container      -> host.arch: Str(amd64)
otel-collector-bwhnc otc-container      -> host.name: Str(otel-agent-test-85657f98cf-97c77)
otel-collector-bwhnc otc-container      -> os.description: Str(Linux 5.4.0-42-generic)
otel-collector-bwhnc otc-container      -> os.type: Str(linux)
otel-collector-bwhnc otc-container      -> process.command_args: Slice(["/usr/lib/jvm/java-17-openjdk-amd64/bin/java","-jar","/app.jar"])
otel-collector-bwhnc otc-container      -> process.executable.path: Str(/usr/lib/jvm/java-17-openjdk-amd64/bin/java)
otel-collector-bwhnc otc-container      -> process.pid: Int(7)
otel-collector-bwhnc otc-container      -> process.runtime.description: Str(Private Build OpenJDK 64-Bit Server VM 17.0.7+7-Ubuntu-0ubuntu120.04)
otel-collector-bwhnc otc-container      -> process.runtime.name: Str(OpenJDK Runtime Environment)
otel-collector-bwhnc otc-container      -> process.runtime.version: Str(17.0.7+7-Ubuntu-0ubuntu120.04)
otel-collector-bwhnc otc-container      -> service.name: Str(otel-agent-test)
otel-collector-bwhnc otc-container      -> telemetry.auto.version: Str(1.27.0)
otel-collector-bwhnc otc-container      -> telemetry.sdk.language: Str(java)
otel-collector-bwhnc otc-container      -> telemetry.sdk.name: Str(opentelemetry)
otel-collector-bwhnc otc-container      -> telemetry.sdk.version: Str(1.27.0)
otel-collector-bwhnc otc-container      -> k8s.pod.ip: Str(10.240.226.19)
otel-collector-bwhnc otc-container      -> k8s.pod.name: Str(otel-agent-test-85657f98cf-97c77)
otel-collector-bwhnc otc-container      -> k8s.namespace.name: Str(kube-system)
otel-collector-bwhnc otc-container      -> k8s.pod.start_time: Str(2023-07-10 19:30:45 +0000 UTC)
otel-collector-bwhnc otc-container      -> k8s.pod.uid: Str(f032e276-6839-4c0c-8047-cc0c47df10c5)
otel-collector-bwhnc otc-container      -> k8s.deployment.name: Str(otel-agent-test)
otel-collector-bwhnc otc-container      -> k8s.node.name: Str(qn8-dm-kwiap104)
otel-collector-bwhnc otc-container      -> container.image.name: Str(creguie1bmrepo001.azurecr.io/architecture/otel-test)
otel-collector-bwhnc otc-container      -> container.image.tag: Str(latest)
otel-collector-bwhnc otc-container ScopeLogs #0
otel-collector-bwhnc otc-container ScopeLogs SchemaURL:
otel-collector-bwhnc otc-container InstrumentationScope com.msci.index.otelagentdemo.RollController
otel-collector-bwhnc otc-container LogRecord #0
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.340106624 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.34 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: INFO
otel-collector-bwhnc otc-container SeverityNumber: Info(9)
otel-collector-bwhnc otc-container Body: Str(Anonymous player is rolling the dice: 2)
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: 20e07ebb30a4091f
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container ScopeLogs #1
otel-collector-bwhnc otc-container ScopeLogs SchemaURL:
otel-collector-bwhnc otc-container InstrumentationScope org.springframework.web.servlet.mvc.method.annotation.RequestResponseBodyMethodProcessor
otel-collector-bwhnc otc-container LogRecord #0
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.340881268 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.34 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: DEBUG
otel-collector-bwhnc otc-container SeverityNumber: Debug(5)
otel-collector-bwhnc otc-container Body: Str(Using 'text/html', given [text/html, application/xhtml+xml, image/webp, image/apng, application/xml;q=0.9, */*;q=0.8, application/signed-exchange;v=b3;q=0.7] and supported [text/plain, */*, application/json, application/*+json])
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: 20e07ebb30a4091f
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container LogRecord #1
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.341067229 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.341 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: DEBUG
otel-collector-bwhnc otc-container SeverityNumber: Debug(5)
otel-collector-bwhnc otc-container Body: Str(Writing ["2"])
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: 20e07ebb30a4091f
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container ScopeLogs #2
otel-collector-bwhnc otc-container ScopeLogs SchemaURL:
otel-collector-bwhnc otc-container InstrumentationScope org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping
otel-collector-bwhnc otc-container LogRecord #0
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.339495631 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.339 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: DEBUG
otel-collector-bwhnc otc-container SeverityNumber: Debug(5)
otel-collector-bwhnc otc-container Body: Str(Mapped to com.msci.index.otelagentdemo.RollController#index(Optional))
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: ef482aef3435b87d
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container ScopeLogs #3
otel-collector-bwhnc otc-container ScopeLogs SchemaURL:
otel-collector-bwhnc otc-container InstrumentationScope org.springframework.web.servlet.DispatcherServlet
otel-collector-bwhnc otc-container LogRecord #0
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.338889173 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.338 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: DEBUG
otel-collector-bwhnc otc-container SeverityNumber: Debug(5)
otel-collector-bwhnc otc-container Body: Str(GET "/rolldice", parameters={})
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: ef482aef3435b87d
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container LogRecord #1
otel-collector-bwhnc otc-container ObservedTimestamp: 2023-07-11 14:27:32.341944354 +0000 UTC
otel-collector-bwhnc otc-container Timestamp: 2023-07-11 14:27:32.341 +0000 UTC
otel-collector-bwhnc otc-container SeverityText: DEBUG
otel-collector-bwhnc otc-container SeverityNumber: Debug(5)
otel-collector-bwhnc otc-container Body: Str(Completed 200 OK)
otel-collector-bwhnc otc-container Trace ID: e4dc6e8a9aa18f44d11361791ce9ffa9
otel-collector-bwhnc otc-container Span ID: ef482aef3435b87d
otel-collector-bwhnc otc-container Flags: 1
otel-collector-bwhnc otc-container      {"kind": "exporter", "data_type": "logs", "name": "logging"}

Additional context

No response

github-actions[bot] commented 1 year ago

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

zparnold commented 1 year ago

For additional context, when we add elasticsearch/log as an exporter, we do get those attributes: [screenshot]
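
A sketch of how both exporters can sit on the same logs pipeline so that they receive identical data (the elasticsearch endpoint and index below are placeholders, not the values actually used):

exporters:
  azuremonitor:
    instrumentation_key: xxxxxxxxxxxxx
  elasticsearch/log:
    endpoints: ["https://elasticsearch.example.com:9200"]  # placeholder
    logs_index: otel-logs                                   # placeholder
service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [k8sattributes, batch]
      exporters: [azuremonitor, elasticsearch/log, logging]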

pcwiese commented 1 year ago

@bastbu Can you take a look here? I know you made a change a few months ago to include resource attributes in the log exporter...

bastbu commented 1 year ago

I tried to reproduce this, but for me all resource attributes originating from the k8sattributes processor show up correctly in Application Insights.

From the original description I can see that k8s.pod.ip, which comes from the k8sattributes processor, shows up correctly in Application Insights.

The Elastic Resource dimensions do not show the net.host.name or net.protocol.name dimensions, which should also be visible if the Resource.* fields are alphabetically ordered. Can we confirm that the configuration for the two collectors is the same and that this isn't a configuration issue?

crobert-1 commented 1 year ago

/label waiting-for-author

github-actions[bot] commented 10 months ago

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions[bot] commented 8 months ago

This issue has been closed as inactive because it has been stale for 120 days with no activity.