Actually, it wouldn't work even with this minimal setup:
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
processors:
  batch/metrics:
    timeout: 1s
    send_batch_size: 50
exporters:
  logging/metrics:
    loglevel: debug
service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch/metrics]
      exporters: [logging/metrics]
Similar environment, except Node.js v17.8.0 and docker-compose usage:
services:
  adot-collector:
    image: public.ecr.aws/aws-observability/aws-otel-collector:latest
    command: ['--config=config.yml']
    volumes:
      - ${PWD}/tools/collector/dev/config.yml:/config.yml
    ...
Code (and none of the static or dynamic labels show up):
const requestsCount = meter.createCounter('http_requests_total', {
  description: 'Total amount of HTTP requests',
  constantAttributes: STATIC_LABELS
});

// later in a request handler
const labels = {
  method: req.method,
  route: req.route.path,
  statusCode: res.statusCode,
  service: SERVICE_NAME
};
requestsCount.add(1, labels);
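(A workaround sketch, not from the original report: the constantAttributes option above does not appear in the stable JS metrics API, so a safer approach is to merge the static labels into the attributes passed to each add() call. STATIC_LABELS and SERVICE_NAME are the reporter's own constants, defined elsewhere.)

// Merge the static labels into the per-request attributes;
// Counter.add(value, attributes) is the documented way to attach
// attributes to a data point.
const requestLabels = {
  ...STATIC_LABELS,      // reporter's constant, defined elsewhere
  method: req.method,
  route: req.route.path,
  statusCode: res.statusCode,
  service: SERVICE_NAME  // reporter's constant, defined elsewhere
};
requestsCount.add(1, requestLabels);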
Output (without docker-compose prefix):
ResourceMetrics #0
Resource SchemaURL:
Resource labels:
-> service.name: STRING(some-service)
-> telemetry.sdk.language: STRING(nodejs)
-> telemetry.sdk.name: STRING(opentelemetry)
-> telemetry.sdk.version: STRING(1.2.0)
InstrumentationLibraryMetrics #0
InstrumentationLibraryMetrics SchemaURL:
InstrumentationLibrary some-service
Metric #0
Descriptor:
-> Name: http_requests_total
-> Description: Total amount of HTTP requests
-> Unit: 1
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: AGGREGATION_TEMPORALITY_CUMULATIVE
NumberDataPoints #0
StartTimestamp: 2022-05-15 01:47:59.073999872 +0000 UTC
Timestamp: 2022-05-15 01:48:11.881179904 +0000 UTC
Value: 2.000000
Switching to otel-collector (exact same config file):
services:
  otel-collector:
    image: otel/opentelemetry-collector:latest
    command: ['--config=config.yml']
    volumes:
      - ${PWD}/tools/collector/dev/config.yml:/config.yml
(Note: the latest tag hasn't been updated in 8 months.)
This produces:
ResourceMetrics #0
Resource labels:
-> service.name: STRING(some-service)
-> telemetry.sdk.language: STRING(nodejs)
-> telemetry.sdk.name: STRING(opentelemetry)
-> telemetry.sdk.version: STRING(1.2.0)
InstrumentationLibraryMetrics #0
InstrumentationLibrary some-service
Metric #0
Descriptor:
-> Name: http_requests_total
-> Description: Total amount of HTTP requests
-> Unit: 1
-> DataType: Sum
-> IsMonotonic: true
-> AggregationTemporality: AGGREGATION_TEMPORALITY_CUMULATIVE
NumberDataPoints #0
Data point attributes:
-> method: STRING(POST)
-> route: STRING(/some-route)
-> statusCode: STRING(200)
-> service: STRING(some-service)
StartTimestamp: 2022-05-15 01:53:50.724 +0000 UTC
Timestamp: 2022-05-15 01:55:25.685500416 +0000 UTC
Value: 3.000000
I just tried again with the recently released v0.18.0 otel-collector image and am still having this issue.
Hi @Rikkedi,
Thank you for providing steps to reproduce. We will take a look at this and see if we can identify the root cause of the issue. When we have more information, we will follow up on this issue.
Hi @bryan-aguilar, any news on the root cause here? Thanks in advance for taking a look.
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 30 days.
I'm able to reproduce this with the configuration mentioned.
@bryan-aguilar Is there an update you can provide?
Hi,
We do recognize this as an issue, but as far as I know no one is actively working on it. We provide GitHub support on a best-effort basis.
I did notice that the latest comment from @dnutels reproduced this with a fairly barebones ADOT Collector config. Can you confirm whether it also fails with that barebones config while using the latest upstream collector? Note that the latest tag is not updated upstream, so you will have to target otel/opentelemetry-collector:0.58.0. If it still fails with this config on the latest upstream collector, I would advise opening an issue upstream. If it passes with the latest upstream version but fails with the latest ADOT Collector, then we can reevaluate the priority. Also, please make sure you are using the latest JS SDK version.
For reference, the barebones config in question:
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
processors:
  batch/metrics:
    timeout: 1s
    send_batch_size: 50
exporters:
  logging/metrics:
    loglevel: debug
service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch/metrics]
      exporters: [logging/metrics]
@bryan-aguilar I've had much better luck with newer versions of both the collector sidecar and the OTel libraries. Labels are working with aws-otel-collector v0.20.0 and the following OTel versions:
"@opentelemetry/api": "1.1.0",
"@opentelemetry/exporter-metrics-otlp-grpc": "0.32.0",
"@opentelemetry/sdk-metrics-base": "0.31.0",
Describe the bug
I'm adding instrumentation to a NestJS service (running in ECS Fargate) using the OTLPMetricExporter from '@opentelemetry/exporter-metrics-otlp-grpc', with a collector sidecar running the awsemf exporter to pipe these metrics to CloudWatch in Embedded Metric Format (EMF). However, I can't get any of my metric attributes to show up as dimensions in EMF, neither in the awsemf debug logs nor in the generated metric stream.

Steps to reproduce
Sample code for the instrumentation

What did you expect to see?
I expect to see metrics in EMF like the following

What did you see instead?
Instead, I get the metrics that come through the Resource object, but none of my custom attributes.

Environment
The collector runs as a sidecar to an ECS Fargate task. config.yaml
I'm mounting the config to the container by generating my own image from aws-otel-collector
Node environment:

Additional context
Running the otlp-grpc exporter with debug logging, I can see the attributes being sent:
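(Not part of the original report: the debug output referenced above comes from the JS SDK's diagnostic logger; a minimal sketch of how it is typically enabled, assuming @opentelemetry/api is already installed.)

// Turn on verbose diagnostics for the OpenTelemetry JS SDK; at DEBUG level the
// exporters log what they are about to send.
const { diag, DiagConsoleLogger, DiagLogLevel } = require('@opentelemetry/api');
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);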