open-telemetry / opentelemetry-java-instrumentation

OpenTelemetry auto-instrumentation and instrumentation libraries for Java
https://opentelemetry.io
Apache License 2.0
1.87k stars 819 forks

Support fine granularity of LLM metadata attribute for LLM spans #11312

Open mxiamxia opened 3 months ago

mxiamxia commented 3 months ago

Is your feature request related to a problem? Please describe.

For users who send LLM requests to a specific model, the auto-instrumented trace spans are expected to provide fine-grained LLM metadata (tokens spent, model name/provider, completion reason, etc.), adhering to the LLM span semantic conventions submitted by the LLM WG.

Describe the solution you'd like

I can contribute and would start with AWS Bedrock by updating aws-sdk-2.2 to add BedrockRuntime as a new supported type in AwsSdkRequestType, and then modifying AwsSdkExperimentalAttributesExtractor to capture the following LLM-specific attributes from Bedrock API requests and responses:

  1. Model ID/Name
  2. Prompt tokens
  3. Output tokens
  4. Model provider
  5. Completion reason
  6. Top_p
  7. Top_k
  8. Temperature
  9. Max_tokens
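As a rough illustration of what the extractor change could look like, here is a minimal, self-contained sketch that maps fields from a Bedrock InvokeModel request onto gen_ai.* span attributes. The class name, the regex-based field lookup, and the exact attribute keys are assumptions for illustration; the real implementation would live in AwsSdkExperimentalAttributesExtractor and reuse the instrumentation's existing JSON handling.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch, not the actual extractor: pull a few request fields
// out of an InvokeModel body and record them under draft gen_ai.* keys.
public class BedrockAttributeSketch {

    // Extract a numeric JSON field like "max_tokens": 512 with a small regex.
    // A real extractor would use proper JSON parsing, not a regex.
    static Long longField(String json, String field) {
        Matcher m = Pattern.compile("\"" + field + "\"\\s*:\\s*(\\d+)").matcher(json);
        return m.find() ? Long.parseLong(m.group(1)) : null;
    }

    // Build the attribute map for one request: model id plus any numeric
    // request parameters (max_tokens here; top_p, top_k, temperature would
    // follow the same pattern with a floating-point variant of longField).
    static Map<String, Object> extractRequestAttributes(String modelId, String body) {
        Map<String, Object> attrs = new LinkedHashMap<>();
        attrs.put("gen_ai.request.model", modelId);
        Long maxTokens = longField(body, "max_tokens");
        if (maxTokens != null) {
            attrs.put("gen_ai.request.max_tokens", maxTokens);
        }
        return attrs;
    }
}
```

Response-side attributes (prompt/output token counts, completion reason) would be captured the same way from the InvokeModel response body, which varies per model family.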

Bedrock supports multiple LLM models; I will start with the following:

  1. Amazon Titan
  2. Anthropic Claude
  3. Meta Llama
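One convenient property for the provider attribute: Bedrock model IDs carry a vendor prefix (e.g. "amazon.titan-text-express-v1", "anthropic.claude-v2", "meta.llama2-13b-chat-v1"), so the segment before the first dot can be derived without a lookup table. The helper below is a hypothetical sketch of that mapping, not existing code in this repo.

```java
// Hypothetical helper: derive the model provider from a Bedrock model id by
// taking the vendor prefix before the first '.', falling back to the full id.
public class BedrockProviderSketch {
    static String providerFromModelId(String modelId) {
        int dot = modelId.indexOf('.');
        return dot > 0 ? modelId.substring(0, dot) : modelId;
    }
}
```

This keeps the instrumentation model-agnostic, so adding a new Bedrock model family would not require touching the provider logic.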

Describe alternatives you've considered

No response

Additional context

No response

laurit commented 3 months ago

We'd welcome a contribution for this