BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(feat) Add usage tracking for streaming `/anthropic` passthrough routes #6842

Closed ishaan-jaff closed 5 hours ago

ishaan-jaff commented 1 day ago

Adds usage tracking + spend tracking + unit testing for pass through anthropic routes
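As a rough illustration of the spend-tracking idea, the sketch below accumulates `output_tokens` from streamed chunks and converts them to a cost. All names, fields, and the per-token rate are hypothetical examples, not litellm's actual implementation or pricing.

```python
class StreamingUsageTracker:
    """Hypothetical sketch: accumulate usage from streamed Anthropic chunks.

    Assumes each parsed chunk is a dict that may carry a
    ``{"usage": {"output_tokens": N}}`` field, as Anthropic's
    ``message_delta`` events do.
    """

    # Example rate for illustration only; real pricing is model-specific.
    PRICE_PER_OUTPUT_TOKEN = 15e-6

    def __init__(self) -> None:
        self.output_tokens = 0

    def on_chunk(self, chunk: dict) -> None:
        # Most chunks carry no usage; only count those that do.
        usage = chunk.get("usage") or {}
        self.output_tokens += usage.get("output_tokens", 0)

    def spend(self) -> float:
        # Total accumulated cost for the stream so far.
        return self.output_tokens * self.PRICE_PER_OUTPUT_TOKEN
```

At the end of the stream, the accumulated totals would be handed to the logging/spend pipeline rather than computed inline like this.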

Relevant issues

Type

🆕 New Feature 🐛 Bug Fix 🧹 Refactoring 📖 Documentation 🚄 Infrastructure ✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

Description by Korbit AI

What change is being made?

Add usage tracking and logging for `/anthropic` passthrough routes in the proxy. This introduces a `convert_str_chunk_to_generic_chunk` function and an `AnthropicPassthroughLoggingHandler` class, which together process Anthropic responses, construct logging payloads, and integrate them into the existing streaming infrastructure.
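A function like `convert_str_chunk_to_generic_chunk` would parse one server-sent-event (SSE) line from Anthropic's streaming API into a provider-agnostic dict. The sketch below is a minimal illustration of that idea under assumed field names; it is not litellm's actual implementation, and the "generic chunk" shape here is invented for the example.

```python
import json
from typing import Optional


def convert_str_chunk_to_generic_chunk(chunk: str) -> Optional[dict]:
    """Parse one Anthropic SSE line into a generic chunk dict (sketch).

    Anthropic streams lines such as:
        data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hi"}}
        data: {"type": "message_delta", "usage": {"output_tokens": 12}}
    Returns None for non-data lines (e.g. "event: ..." or keep-alives).
    """
    chunk = chunk.strip()
    if not chunk.startswith("data:"):
        return None
    try:
        payload = json.loads(chunk[len("data:"):].strip())
    except json.JSONDecodeError:
        return None

    generic = {
        "type": payload.get("type"),
        "text": "",
        "usage": None,
        "is_finished": False,
    }
    if payload.get("type") == "content_block_delta":
        # Incremental text content.
        generic["text"] = payload.get("delta", {}).get("text", "")
    elif payload.get("type") == "message_delta":
        # Final usage accounting arrives on message_delta events.
        generic["usage"] = payload.get("usage")
        generic["is_finished"] = True
    return generic
```

Normalizing each chunk this way lets the existing streaming/logging infrastructure consume Anthropic output without provider-specific branching downstream.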

Why are these changes being made?

To improve monitoring and understanding of Anthropic API usage by providing detailed logging, to align with existing logging patterns for other endpoints, and to enhance transparency and debugging capabilities for requests passing through the proxy. This refactoring aims to modularize and streamline the passthrough handling, making the logging process more efficient and maintainable.


vercel[bot] commented 1 day ago

The latest updates on your projects.

| Name | Status | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| litellm | ✅ Ready | Visit Preview | 💬 Add feedback | Nov 22, 2024 3:36am |