BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: Add streaming support for ClarifAI #4162

Open · ishaan-jaff opened 3 weeks ago

ishaan-jaff commented 3 weeks ago

The Feature

@mogith-pn Can you show me a curl command for making a streaming request to ClarifAI? This isn't mentioned anywhere in your docs or in the Swagger API.

Motivation, pitch

Twitter / LinkedIn details

No response

mogith-pn commented 3 weeks ago


@ishaan-jaff, Clarifai doesn't currently support streaming. It's being worked on internally, but we're unsure of the timeline for the public preview release!

mogith-pn commented 3 weeks ago

That's why, per Krish's suggestion, we wrapped the completion response in an iterator, to ensure it follows the streaming response format. https://github.com/BerriAI/litellm/pull/3369#issuecomment-2090775919
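For readers unfamiliar with the workaround described above, the idea is roughly this: since the provider only returns a full completion, the integration slices that completed text and yields it through an iterator shaped like OpenAI-style streaming chunks. A minimal sketch of that pattern follows; the function name, chunk size, and dict shapes are illustrative assumptions, not litellm's actual internals.

```python
# Sketch: wrap a full (non-streaming) completion response in an iterator so
# callers can consume it as if it were a stream. The chunk schema loosely
# mirrors OpenAI's streaming format (delta objects, then finish_reason="stop");
# names here are hypothetical, not taken from litellm's source.

def wrap_as_stream(completion_text, chunk_size=20):
    """Yield OpenAI-style streaming chunks from an already-complete response."""
    for i in range(0, len(completion_text), chunk_size):
        yield {
            "choices": [
                {
                    "delta": {"content": completion_text[i:i + chunk_size]},
                    "finish_reason": None,
                }
            ]
        }
    # A final empty-delta chunk signals the end of the stream.
    yield {"choices": [{"delta": {}, "finish_reason": "stop"}]}


# Consuming the wrapper reconstructs the original completion text.
text = "".join(
    chunk["choices"][0]["delta"].get("content", "")
    for chunk in wrap_as_stream("Hello from a wrapped completion!")
)
print(text)
```

The trade-off is that the "stream" only begins once the provider has finished generating, so there is no time-to-first-token benefit; it just keeps the client-side interface uniform across providers.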