getsentry / sentry

Developer-first error tracking and performance monitoring
https://sentry.io

LLM Monitoring Pipeline tokens not showing #74638

Open lambopancake opened 1 month ago

lambopancake commented 1 month ago

Environment

SaaS (https://sentry.io/)

What are you trying to accomplish?

I want the "Total tokens used" to display on the columns and graph of the individual trace.

[Image]

How are you getting stuck?

"Total tokens used" displays on the LLM Monitoring tab, but the total tokens do not display on the individual pipeline. I'm currently using Node.js. The initial trace starts with a transaction with op: `"ai.pipeline"`, which has a child span with op: `"ai.chat_completions.create.openai"`. The tokens are then recorded with `span.setData("ai.total_tokens.used", response.usage.total_tokens);`, but they only show up in the LLM Monitoring tab.
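For reference, a minimal sketch of the setup described above, using the `@sentry/node` v7 manual performance API (`startTransaction` / `startChild` / `setData`). The transaction name and the `response` object are placeholders standing in for my actual pipeline name and OpenAI response:

```javascript
const Sentry = require("@sentry/node");

// Parent transaction representing the AI pipeline
const transaction = Sentry.startTransaction({
  op: "ai.pipeline",
  name: "External.openai.prompt", // placeholder pipeline name
});

// Child span for the OpenAI chat completion call
const span = transaction.startChild({
  op: "ai.chat_completions.create.openai",
});

// ...call OpenAI here, then record token usage on the child span:
span.setData("ai.total_tokens.used", response.usage.total_tokens);

span.finish();
transaction.finish();
```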

[Image]

Where in the product are you?

Performance

Link

No response

Version

sentry/node: 7.118.0

getsantry[bot] commented 1 month ago

Assigning to @getsentry/support for routing ⏲️

colin-sentry commented 1 month ago

The child span needs `ai.pipeline.name` set to `External.openai.prompt` in this case - that's how the token usage is linked back.

getsantry[bot] commented 1 month ago

Routing to @getsentry/product-owners-performance for triage ⏲️