Open linsida1 opened 4 days ago
can you share the request being made to litellm for repro?
I also got the same issue using ollama_chat/llama3.1 running locally (Meta Llama 3.1 8B Instruct). Interestingly, ollama/llama3.1 did not have the same issue — I expect this is because it does not track consumed tokens.
Adding str(function_arguments) solved the issue.
@krrishdholakia Yes. I made a simple demo to reproduce this error. Please check it, thank you.
What happened?
Environment information
I use autogen + litellm + ollama for my local tests.
When doing a tool call, litellm raises an error in the token_counter method: TypeError: can only concatenate str (not "dict") to str.
Debugging with VS Code, I found that function_arguments is a dict, not a str.
Can someone help check whether this is a bug? Can we just change line 1638 of utils.py from text += function_arguments to text += str(function_arguments)?
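A minimal sketch of the failure and the proposed fix (hypothetical helper and message shapes for illustration; the actual code lives in litellm's token_counter in utils.py). When a backend returns tool-call arguments as a dict rather than a JSON string, concatenating them into the token-count text raises the TypeError above; coercing with str() avoids it.

```python
# Hypothetical reduction of litellm's token_counter text accumulation.
def count_text(messages):
    text = ""
    for message in messages:
        for tool_call in message.get("tool_calls", []):
            function_arguments = tool_call["function"]["arguments"]
            # Buggy line (utils.py:1638 equivalent):
            #     text += function_arguments
            # raises TypeError: can only concatenate str (not "dict") to str
            # when function_arguments is a dict instead of a JSON string.
            # Proposed fix: coerce to str before concatenating.
            text += str(function_arguments)
    return text

# Example message where the backend returned arguments as a dict,
# as seen with ollama_chat/llama3.1.
messages = [
    {
        "role": "assistant",
        "tool_calls": [
            {"function": {"name": "get_weather",
                          "arguments": {"city": "Paris"}}}
        ],
    }
]
print(count_text(messages))  # no TypeError; arguments coerced to a string
```

Note that str() on a dict produces Python repr syntax, not JSON; if the token count should match what the model actually emitted, json.dumps(function_arguments) may be the more faithful coercion.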