BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Support `store`, `metadata`, and `service_tier` on OpenAI #6022

Open Manouchehri opened 3 days ago

Manouchehri commented 3 days ago

The Feature

We should support the new `store`, `metadata`, and `service_tier` parameters on OpenAI requests.

https://platform.openai.com/docs/api-reference/chat/create#chat-create-store

Motivation, pitch

I want to use `store` with OpenAI via LiteLLM.

Twitter / LinkedIn details

https://www.linkedin.com/in/davidmanouchehri/

krrishdholakia commented 3 days ago

Hey @Manouchehri, `store` and `service_tier` should already work with LiteLLM. I believe they'd just be treated as provider-specific params and passed straight through: https://docs.litellm.ai/docs/completion/provider_specific_params
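To illustrate the pass-through behavior described above, here is a minimal sketch of the idea (a hypothetical `build_request_body` helper, not LiteLLM's actual internals): kwargs that aren't recognized as standard completion params get forwarded to the provider's request body unmodified, which is why `store`, `metadata`, and `service_tier` can work without explicit support.

```python
# Hypothetical sketch of provider-specific param pass-through.
# Not LiteLLM's real implementation -- just the general mechanism:
# unrecognized kwargs are merged into the outgoing request body as-is.

def build_request_body(model, messages, **kwargs):
    """Merge standard params and provider-specific kwargs into one payload."""
    body = {"model": model, "messages": messages}
    # Pass-through: no validation, the provider interprets these fields.
    body.update(kwargs)
    return body

body = build_request_body(
    "gpt-4o-mini",
    [{"role": "user", "content": "Hello!"}],
    store=True,                 # OpenAI: persist the completion
    metadata={"trace": "demo"}, # OpenAI: request metadata
    service_tier="default",     # OpenAI: service tier selection
)
```

With LiteLLM itself, the equivalent would be passing these kwargs directly to `litellm.completion(...)` per the provider-specific params docs linked above.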

Manouchehri commented 3 days ago

Oh interesting, you're right. I just tested `store`, and it works.