BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: Setting "add_function_to_prompt" does not bring any effect #6829

Opened by haoshan98 1 day ago


What happened?

Setting `litellm.add_function_to_prompt = True` has no effect.

It should work as documented: for models/providers without function-calling support, LiteLLM allows you to add the function definitions to the prompt by setting `litellm.add_function_to_prompt = True`.
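For context, the documented fallback amounts to serializing the function schemas into the prompt text so that a model without native function calling can still see them. The sketch below is a hypothetical illustration of that idea in plain Python (the helper name `add_functions_to_prompt` is made up for this example and is not LiteLLM's internal API):

```python
import json

def add_functions_to_prompt(prompt: str, functions: list[dict]) -> str:
    """Hypothetical sketch of the documented fallback: prepend the
    function schemas to the prompt so a model without native
    function-calling support can still see them."""
    schema_text = "\n".join(json.dumps(f, indent=2) for f in functions)
    return (
        "You have access to the following functions:\n"
        f"{schema_text}\n\n"
        f"{prompt}"
    )

functions = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

augmented = add_functions_to_prompt("What's the weather in Paris?", functions)
```

The bug report is that flipping the flag produces no such augmentation of the outgoing prompt.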

Relevant log output

No response

Twitter / LinkedIn details

No response