traceloop / openllmetry

Open-source observability for your LLM application, based on OpenTelemetry
https://www.traceloop.com/openllmetry
Apache License 2.0

Logging LLM provider request ids as `gen_ai` attributes #2236

Status: Open · opened by nirga 3 weeks ago

nirga commented 3 weeks ago

Discussed in https://github.com/traceloop/openllmetry/discussions/2174

Originally posted by **dinmukhamedm** on October 18, 2024:

> It would be very useful for debugging purposes (e.g. when working with OpenAI support) to have a unique identifier attached to an LLM call span. Has OpenLLMetry considered adding something like a `gen_ai.request.id` attribute? The biggest challenge I see is that these ids are not standardized: they are formatted differently across providers, and even across endpoints of a single provider (e.g. completions vs. assistants). Generally, though, there is a request-wide unique id, and where none exists the attribute can simply remain optional.
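To illustrate the proposal, here is a minimal, self-contained sketch of how an instrumentation might extract a provider request id and record it under the proposed attribute. `FakeResponse`, `extract_request_id`, and the attribute constant are all hypothetical names for illustration; they are not part of OpenLLMetry's actual API.

```python
from dataclasses import dataclass
from typing import Optional

# Proposed attribute name from the discussion (assumption, not yet in semconv_ai)
GEN_AI_REQUEST_ID = "gen_ai.request.id"

@dataclass
class FakeResponse:
    # Stand-in for a provider response object; e.g. OpenAI chat completions
    # return a request-wide id such as "chatcmpl-...".
    id: Optional[str] = None

def extract_request_id(response) -> Optional[str]:
    # Ids are formatted differently per provider and endpoint, so we only
    # read a generic `id` field and treat the attribute as optional.
    return getattr(response, "id", None)

# Collect span attributes the way an instrumentation might before
# calling span.set_attribute(...) on each entry.
attrs = {}
rid = extract_request_id(FakeResponse(id="chatcmpl-abc123"))
if rid is not None:
    attrs[GEN_AI_REQUEST_ID] = rid
print(attrs)  # {'gen_ai.request.id': 'chatcmpl-abc123'}
```

Keeping the attribute optional, as suggested above, means endpoints without a request-wide id simply omit it rather than emitting an empty value.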
dinmukhamedm commented 6 days ago

@nirga for this, we'll first need to uncomment https://github.com/traceloop/openllmetry/blob/main/packages/opentelemetry-semantic-conventions-ai/opentelemetry/semconv_ai/__init__.py#L53 and release a new version of opentelemetry.semconv_ai. Could you please do that? I'm happy to add this to some of the providers/endpoints afterwards.
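For context, the commented-out constant referenced above presumably looks roughly like the following sketch; the exact class, constant name, and value in `semconv_ai` may differ, so treat this as an assumption rather than the package's actual code.

```python
class SpanAttributes:
    # Hypothetical sketch of the constant to be uncommented in
    # opentelemetry/semconv_ai/__init__.py; actual name/value may differ.
    LLM_REQUEST_ID = "gen_ai.request.id"

# An instrumentation would then set it on a span, e.g.:
#   span.set_attribute(SpanAttributes.LLM_REQUEST_ID, response.id)
print(SpanAttributes.LLM_REQUEST_ID)  # gen_ai.request.id
```

Once the constant is released, individual provider instrumentations can adopt it incrementally, which matches the "per provider/endpoint" plan in the comment above.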