BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Allow streaming for o1 #6801

Closed. Clad3815 closed this issue 18 hours ago.

Clad3815 commented 3 days ago

The Feature

Streaming is now available for o1-mini and o1-preview, so we no longer need to fake it.
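
For context, a minimal sketch of the expected usage once native o1 streaming is passed through, assuming the standard OpenAI-compatible chunk format litellm already returns for other streaming models (model name and prompt here are illustrative):

```python
import litellm

# Request a streamed completion from an o1 model; once native streaming is
# supported, stream=True should be forwarded to OpenAI rather than simulated
# by chunking a full non-streamed response.
response = litellm.completion(
    model="o1-mini",
    messages=[{"role": "user", "content": "Explain streaming in one sentence."}],
    stream=True,
)

# Iterate over OpenAI-format chunks and print the incremental content.
for chunk in response:
    delta = chunk.choices[0].delta
    if delta and delta.content:
        print(delta.content, end="", flush=True)
```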

Motivation, pitch

https://x.com/OpenAIDevs/status/1858609150999359559

Twitter / LinkedIn details

No response