ewoolsey opened this issue 10 months ago
The logs should be correlated automatically. Are you not seeing that behavior?
Yes - this should be correlated automatically, as long as the application is instrumented using OpenTelemetry tracing (either through tracing-opentelemetry or directly via the opentelemetry tracing API) - not if it is using any other (non-otel) tracing subscriber.
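For context, "instrumented using opentelemetry tracing" here means the subscriber carries both the otel tracing layer and the otel log appender. A minimal sketch of that wiring (illustrative only - the stdout exporters and the "example" tracer name are placeholders, APIs as of opentelemetry 0.21 / tracing-opentelemetry 0.22, not the reporter's exact setup):

use opentelemetry::trace::TracerProvider as _;
use opentelemetry_appender_tracing::layer::OpenTelemetryTracingBridge;
use tracing_subscriber::prelude::*;

fn init_telemetry() {
    // Spans -> otel traces.
    let tracer_provider = opentelemetry_sdk::trace::TracerProvider::builder()
        .with_simple_exporter(opentelemetry_stdout::SpanExporter::default())
        .build();
    let tracer = tracer_provider.tracer("example");

    // Events -> otel logs.
    let logger_provider = opentelemetry_sdk::logs::LoggerProvider::builder()
        .with_simple_exporter(opentelemetry_stdout::LogExporter::default())
        .build();

    tracing_subscriber::registry()
        .with(tracing_opentelemetry::layer().with_tracer(tracer))
        .with(OpenTelemetryTracingBridge::new(&logger_provider))
        .init();
}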
This is the information I'm getting in loki, with no trace_id information.
Logs are being generated with this code:

let logs = opentelemetry_otlp::new_pipeline()
    .logging()
    .with_exporter(opentelemetry_otlp::new_exporter().http())
    .with_log_config(
        opentelemetry_sdk::logs::config().with_resource(Resource::new(vec![
            KeyValue::new("service.name", "liquidation-bot-rs"),
        ])),
    )
    .install_batch(opentelemetry_sdk::runtime::Tokio)?;

let logging = OpenTelemetryTracingBridge::new(&logs.provider().unwrap())
    .with_filter(EnvFilter::from_default_env());

subscriber = Registry::default()
    .with(stdout)
    .with(logging)
    .with(paper_trail)
    .with(metrics)
    .with(tracing)
I can confirm things are properly instrumented with tracing-opentelemetry. The log in the screenshot above is inside a function with #[instrument], and I am correctly seeing traces and events.
Can you use stdout exporter and share the output? We just want to confirm if this issue is with OTLP or Loki itself.
Or just the log output from otel collector
This code somehow results in no output...
let exporter = opentelemetry_stdout::LogExporter::default();
let logs = LoggerProvider::builder()
    .with_simple_exporter(exporter)
    .build();
let logging = OpenTelemetryTracingBridge::new(&logs);
The collector logs are scarce:
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.702Z info service@v0.88.0/telemetry.go:84 Setting up own telemetry...
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.702Z info service@v0.88.0/telemetry.go:201 Serving Prometheus metrics {"address": ":8888", "level": "Basic"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.703Z info lokiexporter@v0.88.0/exporter.go:41 using the new Loki exporter {"kind": "exporter", "data_type": "logs", "name": "loki"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.707Z info service@v0.88.0/service.go:143 Starting otelcol-contrib... {"Version": "0.88.0", "NumCPU": 8}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:33 Starting extensions...
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/server"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/server"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/traces"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/traces"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/metrics"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/metrics"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/logs"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.708Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/logs"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.709Z warn internal@v0.88.0/warning.go:40 Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks {"kind": "receiver", "name": "otlp", "data_type": "metrics", "documentation": "https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.709Z info otlpreceiver@v0.88.0/otlp.go:101 Starting HTTP server {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "0.0.0.0:4318"}
[otel-collector] [2023-11-16 07:19:57] 2023-11-16T07:19:57.710Z info service@v0.88.0/
Follow this example as-is https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/README.md#manual
You should see logs correctly correlated to traces https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/src/main.rs#L126
Do you know what's going on with the stdout exporter code I posted above? Something about it is blocking all output on the tracing subscriber.
I should have been more precise, apologies. You need the "logging" exporter in the collector to see the actual data it receives: https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/otel-collector-config.yaml#L14
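For reference, a minimal sketch of the relevant collector config (mirroring the linked example; collector-contrib 0.88 still accepts the deprecated loglevel key, and the otlp/loki names assume a setup like the one in the logs above):

exporters:
  logging:
    loglevel: debug  # newer collectors use `verbosity: detailed` instead

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [logging, loki]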
Start with the example from the repo. That should work as-is. And you can tweak it to match your setup. https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-appender-tracing/examples/basic.rs
Debug logs:
2023-11-16T18:21:31.684Z info service@v0.88.0/telemetry.go:84 Setting up own telemetry...
2023-11-16T18:21:31.685Z info service@v0.88.0/telemetry.go:201 Serving Prometheus metrics {"address": ":8888", "level": "Basic"}
2023-11-16T18:21:31.685Z info exporter@v0.88.0/exporter.go:275 Deprecated component. Will be removed in future releases. {"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-11-16T18:21:31.685Z warn common/factory.go:68 'loglevel' option is deprecated in favor of 'verbosity'. Set 'verbosity' to equivalent value to preserve behavior. {"kind": "exporter", "data_type": "logs", "name": "logging", "loglevel": "debug", "equivalent verbosity level": "Detailed"}
2023-11-16T18:21:31.685Z info service@v0.88.0/service.go:143 Starting otelcol-contrib... {"Version": "0.88.0", "NumCPU": 10}
2023-11-16T18:21:31.685Z info extensions/extensions.go:33 Starting extensions...
2023-11-16T18:21:31.685Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/logs"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/logs"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/server"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/server"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/traces"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/traces"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:36 Extension is starting... {"kind": "extension", "name": "basicauth/metrics"}
2023-11-16T18:21:31.685Z info extensions/extensions.go:43 Extension started. {"kind": "extension", "name": "basicauth/metrics"}
2023-11-16T18:21:31.686Z warn internal@v0.88.0/warning.go:40 Using the 0.0.0.0 address exposes this server to every network interface, which may facilitate Denial of Service attacks {"kind": "receiver", "name": "otlp", "data_type": "metrics", "documentation": "https://github.com/open-telemetry/opentelemetry-collector/blob/main/docs/security-best-practices.md#safeguards-against-denial-of-service-attacks"}
2023-11-16T18:21:31.686Z info otlpreceiver@v0.88.0/otlp.go:101 Starting HTTP server {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "0.0.0.0:4318"}
2023-11-16T18:21:31.686Z info service@v0.88.0/service.go:169 Everything is ready. Begin running and processing data.
2023-11-16T18:21:37.557Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2023-11-16T18:21:37.558Z info ResourceLog #0
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:37.443094 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: DEBUG
SeverityNumber: Debug(5)
Body: Str(Initializing client with url: wss://testnet.tm.injective.network/websocket)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
2023-11-16T18:21:38.573Z info LogsExporter {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 5, "log records": 5}
2023-11-16T18:21:38.574Z info ResourceLog #0
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:38.154245 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: TRACE
SeverityNumber: Trace(1)
Body: Str(syncing data)
Trace ID:
Span ID:
Flags: 0
ResourceLog #1
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:38.261633 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: TRACE
SeverityNumber: Trace(1)
Body: Str(syncing markets)
Attributes:
-> height: Str(18494177)
Trace ID:
Span ID:
Flags: 0
ResourceLog #2
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:38.261741 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: TRACE
SeverityNumber: Trace(1)
Body: Str(syncing collaterals)
Attributes:
-> height: Str(18494177)
Trace ID:
Span ID:
Flags: 0
ResourceLog #3
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:38.261778 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: TRACE
SeverityNumber: Trace(1)
Body: Str(syncing oracle asset details)
Attributes:
-> height: Str(18494177)
Trace ID:
Span ID:
Flags: 0
ResourceLog #4
Resource SchemaURL:
Resource attributes:
-> service.name: Str(liquidation-bot-rs)
ScopeLogs #0
ScopeLogs SchemaURL:
InstrumentationScope opentelemetry-appender-tracing 0.2.0
LogRecord #0
ObservedTimestamp: 2023-11-16 18:21:38.370362 +0000 UTC
Timestamp: 1970-01-01 00:00:00 +0000 UTC
SeverityText: TRACE
SeverityNumber: Trace(1)
Body: Str(syncing markets)
Attributes:
-> height: Str(18494178)
Trace ID:
Span ID:
Flags: 0
{"kind": "exporter", "data_type": "logs", "name": "logging"}
Looks like the trace ids are being omitted.
Follow this example as-is https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/README.md#manual
You should see logs correctly correlated to traces https://github.com/open-telemetry/opentelemetry-rust/blob/main/opentelemetry-otlp/examples/basic-otlp/src/main.rs#L126
@ewoolsey Could you start with the above example (which we know is working as expected), and progressively make changes to it to match your setup? That'll help us identify root-cause quickly.
I just did, and that example doesn't emit a trace_id either, even when inside a span.
Code:
use opentelemetry::KeyValue;
use opentelemetry_appender_tracing::layer;
use opentelemetry_sdk::{
    logs::{Config, LoggerProvider},
    Resource,
};
use tracing::{error, span};
use tracing_core::Level;
use tracing_subscriber::prelude::*;

fn main() {
    let exporter = opentelemetry_stdout::LogExporter::default();
    let provider: LoggerProvider = LoggerProvider::builder()
        .with_config(
            Config::default().with_resource(Resource::new(vec![KeyValue::new(
                "service.name",
                "log-appender-tracing-example",
            )])),
        )
        .with_simple_exporter(exporter)
        .build();
    let layer = layer::OpenTelemetryTracingBridge::new(&provider);
    tracing_subscriber::registry().with(layer).init();

    let my_span = span!(Level::TRACE, "my_span");
    // `my_span` exists but has not been entered.

    // Enter `my_span`...
    let _enter = my_span.enter();

    error!(name: "my-event-name", target: "my-system", event_id = 20, user_name = "otel", user_email = "otel@opentelemetry.io");
    drop(provider);
}
Output:
{"resourceLogs":[{"resource":{"attributes":[{"key":"service.name","value":{"stringValue":"log-appender-tracing-example"}}]},"scopeLogs":[{"scope":{"name":"opentelemetry-appender-tracing","version":"0.2.0"},"logR
ecords":[{"timeUnixNano":null,"time":null,"observedTimeUnixNano":1700160228053182000,"observedTime":"2023-11-16 18:43:48.053","severityNumber":17,"severityText":"ERROR","attributes":[{"key":"name","value":{"stri
ngValue":"my-event-name"}},{"key":"event_id","value":{"intValue":20}},{"key":"user_name","value":{"stringValue":"otel"}},{"key":"user_email","value":{"stringValue":"otel@opentelemetry.io"}}],"droppedAttributesCo
unt":0}]}]}]}
I don't see the tracing-opentelemetry subscriber configured in the above example before emitting spans.
Ah right - I just blindly ran that example since you linked it.
Anyhow, I found the root cause.
This does not emit a trace id:

#[instrument]
pub async fn my_func() {
    trace!("fun log");
}
But this does:

#[instrument]
pub async fn my_func() {
    let tracer = opentelemetry::global::tracer("ex.com/basic");
    tracer.in_span("cool span", |_| {
        trace!("fun log");
    });
}
Should #[instrument] not be taking care of this for me?
That is a question for the tracing-opentelemetry repo! It should have worked, but it might need additional setup/config, which is better asked about there.
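One workaround that follows from the in_span observation above - a hedged sketch, relying on tracing-opentelemetry's OpenTelemetrySpanExt to recover the otel Context for the current tracing span, and on the SDK logger reading the current Context when emitting:

use tracing::{instrument, trace};
use tracing_opentelemetry::OpenTelemetrySpanExt;

#[instrument]
pub async fn my_func() {
    // Recover the otel Context that tracing-opentelemetry built for this
    // span and make it current while the log event is emitted.
    let cx = tracing::Span::current().context();
    let _guard = cx.attach();
    trace!("fun log"); // should now carry the span's trace id
}

This is essentially what tracer.in_span does under the hood: it attaches an otel Context around the closure.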
I can't have been the only person to run into this; it seems like a pretty common use case. I suppose I can open up an issue over there.
Thanks so much for your help talking through this! Appreciate your time. I'll mark this issue as closed here.
Sorry, I think I need to reopen this. I'm pretty sure the issue lies with the opentelemetry-appender-tracing crate. Looking at its code, nowhere does it use the LogRecord trace_context field, which I believe is required to make this feature work correctly, no?
opentelemetry-appender-tracing expects the parent's trace-context to be stored in the otel Context, while tracing-opentelemetry manages the parent-child relationship of spans primarily within the tracing context, and it is not translated into the OpenTelemetry Context for logs. Will have a look into this further, though I'm not sure where the change lies.
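In other words (an illustrative sketch of the mismatch; the assertion is just for demonstration):

use opentelemetry::{trace::TraceContextExt, Context};
use tracing::instrument;

#[instrument]
async fn my_func() {
    // tracing-opentelemetry tracks this span inside tracing's own registry;
    // the OpenTelemetry Context is never updated, so the log appender finds
    // no active otel span to correlate against.
    assert!(!Context::current().span().span_context().is_valid());
}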
Yeah, I think that's an accurate description. I would think that opentelemetry-appender-tracing should be doing the work here, but this ecosystem is complicated enough that I won't claim to know how things should work haha.
I changed the name of this issue to reflect what I think should be the expected behaviour.
@lalitb I need to try and get this going for my work - would you be able to help me out with a quick workaround? Within the on_event trait method inside opentelemetry-appender-tracing, how exactly should I be extracting the trace_id from the tracing ctx? Here's what I have so far, but I can't quite figure out the very last step.
impl<'a, S, P, L> Layer<S> for OpenTelemetryTracingBridge<P, L>
where
    for<'lookup> S: tracing_subscriber::registry::LookupSpan<'lookup>,
    S: tracing::Subscriber,
    P: LoggerProvider<Logger = L> + Send + Sync + 'static,
    L: Logger + Send + Sync + 'static,
{
    fn on_event(&self, event: &tracing::Event<'_>, ctx: tracing_subscriber::layer::Context<'_, S>) {
        let meta = event.metadata();
        let mut log_record: LogRecord = LogRecord::default();
        log_record.severity_number = Some(severity_of_level(meta.level()));
        log_record.severity_text = Some(meta.level().to_string().into());
        if let Some(current_span) = ctx.lookup_current() {
            if let Some(mut otel_data) = current_span.extensions_mut().remove::<OtelData>() {
                log_record.with_context(otel_data.parent_cx);
            }
        }
        ...
@ewoolsey with_context is defined for LogRecordBuilder, so it won't work here. Did you try directly assigning to log_record.trace_context as done in #1394? I believe it should work.
Also, please remember this will call both OpenTelemetryTracingBridge::on_event() and OpenTelemetryLayer::on_event(), so the logs would be exported both as SpanEvent and LogRecord.
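A hypothetical sketch of that direct assignment inside on_event (illustrative only - TraceContext is from opentelemetry::logs, OtelData from tracing-opentelemetry, field availability depends on crate versions, and #1394 should be treated as the authoritative version):

use opentelemetry::logs::TraceContext;
use tracing_opentelemetry::OtelData;

if let Some(current_span) = ctx.lookup_current() {
    let extensions = current_span.extensions();
    if let Some(otel_data) = extensions.get::<OtelData>() {
        // The span id sits on this span's SpanBuilder; the trace id may only
        // be present on the root span's builder, so a complete fix might
        // need to walk up current_span.scope().
        if let (Some(trace_id), Some(span_id)) =
            (otel_data.builder.trace_id, otel_data.builder.span_id)
        {
            log_record.trace_context = Some(TraceContext {
                trace_id,
                span_id,
                trace_flags: None,
            });
        }
    }
}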
Oh that's amazing I didn't realize a PR was already being worked on! I'm sure that'll be exactly what I need. Thank you!
Has there been any progress on this? Even just a workaround? I haven't been able to find any good examples and have tried five or so possible solutions, to no avail.
Is there an accepted way to correlate logs & traces when using tracing_subscriber? As a hack I'm using opentelemetry-appender-tracing along with tracing-opentelemetry to capture both traces and logs, which is inefficient. Is there a better way?
If I only use tracing-opentelemetry, I get the traces ingested, but log events can't set the proper level because they're technically ingested as traces.
If anyone has some working code they can share that correlates log events with traces, I'd very much appreciate it.
@ewoolsey did you manage to find a workaround for this, or did you fork the library somehow? Would love to piggyback on yours :-D
I also stumbled over this issue and assumed that the tracing instrument macros work with the opentelemetry framework, so I hacked something together here: https://github.com/jpramosi/opentelemetry-rust/. It seems to work for a few small examples, but I don't know if there are any pitfalls or other new issues. The SigNoz trace-to-log correlation and the otel collector logs look promising at least for traces and logs (metrics not tested):
[Before/After screenshots: 'LogRecord' got trace and span ids now]
If someone would like to test it, append this to the Cargo.toml file:
[dependencies]
# ...
opentelemetry-tracing = { version = "0.25" }
[patch.crates-io]
opentelemetry = { package = "opentelemetry", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-appender-tracing = { package = "opentelemetry-appender-tracing", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-http = { package = "opentelemetry-http", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-otlp = { package = "opentelemetry-otlp", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry_sdk = { package = "opentelemetry_sdk", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
opentelemetry-tracing = { package = "opentelemetry-tracing", git = "https://github.com/jpramosi/opentelemetry-rust.git" }
The fork includes tracing-opentelemetry, but it was renamed to 'opentelemetry-tracing', so the crate name needs to be updated in your code or it won't work at all. A working example can be seen here.
This is amazing! Thank you for making the changes and sharing your code. It's working for me.
I got it working after doing the switch from tracing-opentelemetry to opentelemetry-tracing as you suggested. I also bumped my version of opentelemetry down to 0.24.0, as that's the same as the one you released.
It's great to be able to correlate logs with traces, as is possible with the Grafana Tempo + Loki combo. This could likely be exposed as some sort of configuration option.