Open patrickwasp opened 5 months ago
The language model made an error in formulating the arguments to the function `exchange_rate`:
Arguments:

```json
{
  "name": "currency_calculator",
  "arguments": {
    "base_amount": 123.45,
    "quote_currency": "USD"
  }
}
```
The above is clearly incorrect. The correct arguments in JSON should just be the inner object under `arguments` above.
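A minimal sketch of the difference (the function and field names come from the log above; the `unwrap_arguments` helper is a hypothetical client-side workaround, not part of AutoGen or LiteLLM):

```python
import json

# What the model actually emitted for the tool call's arguments:
# the whole tool-call envelope, not just the parameter object.
malformed = {
    "name": "currency_calculator",
    "arguments": {
        "base_amount": 123.45,
        "quote_currency": "USD",
    },
}

# What the function executor expects: only the inner parameter object.
expected = {"base_amount": 123.45, "quote_currency": "USD"}


def unwrap_arguments(raw: dict) -> dict:
    """Hypothetical tolerant unwrapper: strip a spurious
    {"name": ..., "arguments": {...}} envelope if present."""
    if set(raw) == {"name", "arguments"} and isinstance(raw.get("arguments"), dict):
        return raw["arguments"]
    return raw


print(json.dumps(unwrap_arguments(malformed)))
```

This is only to illustrate the shape mismatch; the real fix turned out to be the provider prefix discussed below in the thread.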
You can take a look at the expected output of tool-call messages: https://microsoft.github.io/autogen/docs/tutorial/tool-use#using-tool
cc @marklysze for awareness.
Hey @patrickwasp, for function calling with LiteLLM you'll need to use `ollama_chat/` instead of `ollama/` in the model name (or command line).
So `ollama/dolphincoder:8k` should be `ollama_chat/dolphincoder:8k`.
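For reference, a minimal configuration sketch under the suggested fix. The proxy address, port, and placeholder `api_key` are assumptions for illustration, not values from this thread:

```python
# Start the LiteLLM proxy with the chat-capable Ollama prefix, e.g.:
#   litellm --model ollama_chat/dolphincoder:8k
# Then point AutoGen at the proxy (base_url/port assumed; adjust to your setup):
config_list = [
    {
        "model": "ollama_chat/dolphincoder:8k",  # note ollama_chat/, not ollama/
        "base_url": "http://localhost:4000",     # LiteLLM proxy address (assumed)
        "api_key": "NotRequired",                # local proxy ignores the key
    }
]

print(config_list[0]["model"])
```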
Hey @patrickwasp, did this help solve your function calling issue?
Describe the issue
When I try to run the example listed at https://microsoft.github.io/autogen/docs/topics/non-openai-models/local-litellm-ollama#example-with-function-calling, it can't find the function. The previous, simpler example without function calling (https://microsoft.github.io/autogen/docs/topics/non-openai-models/local-litellm-ollama#example-with-function-calling) works as expected.
This is my environment:
docker-compose.yaml
Steps to reproduce
Screenshots and logs
output
Additional Information
autogenstudio==0.0.56, Python 3.12, Ubuntu 22.04