open-telemetry / opentelemetry-dotnet

The OpenTelemetry .NET Client
https://opentelemetry.io

How can I transform metrics received through another channel to OTLP metrics? #3845

Open unwork-ag opened 1 year ago

unwork-ag commented 1 year ago

Discussed in https://github.com/open-telemetry/opentelemetry-dotnet/discussions/3466

Originally posted by **unwork-ag** July 21, 2022

A couple of years back we built a much-simplified, home-grown collector, and we now want to integrate it with OpenTelemetry. Our collector receives JSON data through REST calls, and one type of data is performance information. We would like to send this to an OTel Collector via OTLP as metric data (i.e. histograms), but we are not sure how to feed the data we receive into the OTel pipeline. Things we have considered so far:

- Creating a meter in our own collector and adding histograms for each type of performance record we get. This makes the record look as if it was created in the receiving service (our custom collector), which is wrong. We have all the original data available, but we didn't find a way to put it into the metrics data.
- Building a custom metrics reader that takes our data and transforms it into OpenTelemetry metrics. This looked promising, but unfortunately we cannot construct the `Metric` type that we need to pass to the exporter (the constructor is internal: https://github.com/open-telemetry/opentelemetry-dotnet/blob/23609730ddd73c860553de847e67c9b2226cff94/src/OpenTelemetry/Metrics/Metric.cs).

I think we are not approaching this correctly. Does someone have a suggestion on how to do this the right way?
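For reference, a minimal sketch of the first approach described above (re-recording the incoming data through a regular `Meter`). The `PerfRecord` type, the meter name `MyCompany.PerfBridge`, the instrument name `request.duration`, the service name, and the attribute keys are all placeholders invented for illustration, not part of the original report. Note that this sketch has exactly the limitation the bullet mentions: the original source can only be attached as ordinary data-point attributes, while the SDK Resource still identifies the bridge process.

```csharp
using System.Diagnostics.Metrics;
using OpenTelemetry;
using OpenTelemetry.Metrics;
using OpenTelemetry.Resources;

// Hypothetical record deserialized from the JSON received over REST.
public record PerfRecord(string OriginalService, string Operation, double DurationMs);

public static class PerfBridge
{
    // Meter and instrument names are placeholders chosen for this sketch.
    private static readonly Meter Meter = new("MyCompany.PerfBridge");
    private static readonly Histogram<double> Duration =
        Meter.CreateHistogram<double>("request.duration", unit: "ms");

    public static MeterProvider BuildProvider() =>
        Sdk.CreateMeterProviderBuilder()
            .SetResourceBuilder(ResourceBuilder.CreateDefault()
                .AddService("custom-collector-bridge")) // Resource still names the bridge process
            .AddMeter(Meter.Name)
            .AddOtlpExporter() // OTLP export (default endpoint http://localhost:4317)
            .Build();

    // Called for each performance record received through the REST channel.
    public static void Record(PerfRecord record) =>
        Duration.Record(record.DurationMs,
            new KeyValuePair<string, object?>("original.service.name", record.OriginalService),
            new KeyValuePair<string, object?>("operation", record.Operation));
}
```

A host would call `PerfBridge.BuildProvider()` once at startup and `PerfBridge.Record(...)` from the REST handler; histogram aggregation and OTLP export are then handled by the SDK's periodic reader.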
mikoskinen commented 1 year ago

Hi,

Did you find a solution to this?

unwork-ag commented 1 year ago

Unfortunately not. A similar question was asked here and was answered with a reference to some specification work on metric-reader bridges. I don't think this is available in opentelemetry-dotnet yet.

mikoskinen commented 1 year ago

Thank you for your prompt response. I appreciate it.