BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Add column to Readme table showing tool calling support #4194

Open aantn opened 5 months ago

aantn commented 5 months ago

The Feature

Title says it all

Motivation, pitch

I want to understand if LiteLLM meets my requirements

Twitter / LinkedIn details

@aantn https://www.linkedin.com/in/natanyellin

krrishdholakia commented 5 months ago

hey @aantn you can check which models support tool calling with `litellm.supports_function_calling(model="")` — it returns `True` if the model supports function calling, `False` if not

https://docs.litellm.ai/docs/completion/function_call