BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

(Feat) DB Schema change - add `custom_llm_provider` in SpendLogs #6823

Status: Closed (ishaan-jaff closed this PR 13 hours ago)

ishaan-jaff commented 2 days ago

Title

Relevant issues

Type

🆕 New Feature 🐛 Bug Fix 🧹 Refactoring 📖 Documentation 🚄 Infrastructure ✅ Test

Changes

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes
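As a rough illustration of the proposed change, the sketch below adds a `custom_llm_provider` column to a simplified stand-in for the SpendLogs table. The table layout and column names other than `custom_llm_provider` are illustrative only; LiteLLM's real table is defined in its Prisma schema and has many more fields.

```python
import sqlite3

# Miniature stand-in for SpendLogs; columns besides custom_llm_provider
# are illustrative, not LiteLLM's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE SpendLogs (
        request_id TEXT PRIMARY KEY,
        model TEXT,
        spend REAL
    )
    """
)

# The schema change this PR proposes: record the provider
# (e.g. "azure", "bedrock") directly on each spend-log row.
conn.execute("ALTER TABLE SpendLogs ADD COLUMN custom_llm_provider TEXT")

conn.execute(
    "INSERT INTO SpendLogs VALUES (?, ?, ?, ?)",
    ("req-1", "gpt-4o", 0.002, "azure"),
)
row = conn.execute(
    "SELECT custom_llm_provider FROM SpendLogs WHERE request_id = 'req-1'"
).fetchone()
print(row[0])  # azure
```

With the column in place, per-provider spend can be aggregated with a plain `GROUP BY custom_llm_provider` instead of joining back through deployment metadata.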

vercel[bot] commented 2 days ago

The latest updates on your projects. Learn more about Vercel for Git ↗︎

Name: litellm | Status: ✅ Ready | Updated (UTC): Nov 20, 2024 4:28am
krrishdholakia commented 1 day ago

@ishaan-jaff is this necessary?

We log the 'model_id', which should allow us to get the custom_llm_provider for any specific log.

Asking as I believe we should be trying to minimize the size of the table.
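The alternative described above can be sketched as resolving the provider from the logged `model_id` at read time instead of widening the table. The deployment mapping and helper below are hypothetical, illustrative stand-ins, not LiteLLM's actual internals.

```python
from typing import Optional

# Hypothetical deployment registry keyed by model_id, standing in for the
# proxy's deployment config, which already knows each model's provider.
DEPLOYMENTS = {
    "model-abc123": {"model": "azure/gpt-4o", "custom_llm_provider": "azure"},
    "model-def456": {"model": "bedrock/claude-3", "custom_llm_provider": "bedrock"},
}


def provider_for_log(spend_log: dict) -> Optional[str]:
    """Resolve custom_llm_provider for a spend-log row from its model_id."""
    deployment = DEPLOYMENTS.get(spend_log.get("model_id"))
    return deployment["custom_llm_provider"] if deployment else None


print(provider_for_log({"model_id": "model-abc123"}))  # azure
print(provider_for_log({"model_id": "unknown"}))       # None
```

The trade-off is the usual denormalization question: storing the provider per row makes spend queries self-contained, while deriving it keeps SpendLogs narrow at the cost of a lookup (and of losing history if a deployment's config changes).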

ishaan-jaff commented 13 hours ago

> Asking as I believe we should be trying to minimize the size of the table.

Yeah, I agree with your feedback; I don't think this is necessary. We'll wait for a user to tell us if this is really something they need, since they can already track this in Prometheus.