BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `huggingface/Mistral-Small-Instruct-2409` support function calling but not according to the litellm api #6847

Open fguich opened 22 hours ago

fguich commented 22 hours ago

What happened?

According to the model card (https://huggingface.co/mistralai/Mistral-Small-Instruct-2409), Mistral-Small-Instruct-2409 does support function calling. However, `litellm.supports_function_calling` says the opposite, and I'm afraid this prevents using function calling with this model through the litellm API.

Relevant log output

import litellm
litellm.supports_function_calling(model = "huggingface/mistralai/Mistral-Small-Instruct-2409")
# -> returns False
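For context: `litellm.supports_function_calling` answers from litellm's bundled model-metadata map rather than by probing the provider, so a missing or stale entry yields `False` even when the model itself supports tools; litellm also exposes `litellm.register_model` for patching entries locally. The sketch below is a hypothetical, simplified version of that lookup-and-override pattern, with a local dict standing in for litellm's real metadata map (the dict contents and helper names here are illustrative, not litellm's actual data):

```python
# Hypothetical, simplified model of how a gateway resolves capability flags:
# a static metadata map keyed by model name, consulted locally.
MODEL_METADATA = {
    "huggingface/mistralai/Mistral-7B-Instruct-v0.3": {
        "supports_function_calling": True,
    },
    # "huggingface/mistralai/Mistral-Small-Instruct-2409" is deliberately
    # absent, mirroring the stale-map situation described in this issue.
}

def supports_function_calling(model: str) -> bool:
    """Return the capability flag for `model`, defaulting to False
    when the model has no metadata entry at all."""
    return MODEL_METADATA.get(model, {}).get("supports_function_calling", False)

def register_model(overrides: dict) -> None:
    """Merge caller-supplied metadata into the map, analogous in spirit
    to litellm.register_model."""
    for name, info in overrides.items():
        MODEL_METADATA.setdefault(name, {}).update(info)

# Missing entry -> False, even though the model itself supports tools.
print(supports_function_calling("huggingface/mistralai/Mistral-Small-Instruct-2409"))

# Registering an override flips the answer without touching provider code.
register_model({
    "huggingface/mistralai/Mistral-Small-Instruct-2409": {
        "supports_function_calling": True,
    }
})
print(supports_function_calling("huggingface/mistralai/Mistral-Small-Instruct-2409"))
```

Under this reading, the fix is a metadata update (in litellm's model map, or locally via `register_model`) rather than a code change in the HuggingFace provider itself.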

Twitter / LinkedIn details

No response