BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: TypeError: can only concatenate str (not "dict") to str #6958

Open linsida1 opened 4 days ago

linsida1 commented 4 days ago

What happened?

Environment

Information

I use autogen + litellm + ollama for my local test.
When doing a tool call, litellm raises an error in the token_counter method: TypeError: can only concatenate str (not "dict") to str.

I debugged it with VS Code and found that function_arguments is a dict, not a str.

Can someone help check whether this is a bug? Could the line text += function_arguments (line 1638 of utils.py) simply be changed to text += str(function_arguments)?
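
A minimal standalone sketch of what seems to trigger this (the tool name and arguments are placeholders I made up, and this assumes an affected litellm version):

import litellm

# An assistant message whose tool_call "arguments" field is a dict instead of
# the JSON string the OpenAI format expects. token_counter walks the message,
# reaches `text += function_arguments` in utils.py, and raises the TypeError.
messages = [
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [
            {
                "id": "call_0",  # placeholder id
                "type": "function",
                "function": {
                    "name": "get_weather",           # placeholder tool name
                    "arguments": {"city": "Paris"},  # dict, not a JSON string
                },
            }
        ],
    }
]

litellm.token_counter(model="ollama_chat/llama3.1", messages=messages)
# TypeError: can only concatenate str (not "dict") to str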

Relevant log output

Error processing publish message
Traceback (most recent call last):
  File ".venv/lib/python3.11/site-packages/litellm/main.py", line 481, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 612, in ollama_acompletion
    raise e  # don't use verbose_logger.exception, if exception is raised
    ^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/llms/ollama_chat.py", line 594, in ollama_acompletion
    prompt_tokens = response_json.get("prompt_eval_count", litellm.token_counter(messages=data["messages"]))  # type: ignore
                                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1638, in token_counter
    text += function_arguments
TypeError: can only concatenate str (not "dict") to str

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File ".venv/lib/python3.11/site-packages/autogen_core/application/_single_threaded_agent_runtime.py", line 402, in _process_publish
    await asyncio.gather(*responses)
  File ".venv/lib/python3.11/site-packages/autogen_core/application/_single_threaded_agent_runtime.py", line 394, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/autogen_core/components/_routed_agent.py", line 484, in on_message
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/autogen_core/components/_routed_agent.py", line 148, in wrapper
    return_value = await func(self, message, ctx)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "src/autogen_service/ag_core/hand_offs.py", line 57, in handle_task
    llm_result = await self._model_client.create(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "src/autogen_service/ag_exts/models/litellm/_litellm_client.py", line 432, in create
    result: Union[ParsedChatCompletion[BaseModel], ChatCompletion] = await future
                                                                     ^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1175, in wrapper_async
    raise e
  File ".venv/lib/python3.11/site-packages/litellm/utils.py", line 1031, in wrapper_async
    result = await original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/main.py", line 503, in acompletion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File ".venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2136, in exception_type
    raise e
  File ".venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2112, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: can only concatenate str (not "dict") to str


krrishdholakia commented 3 days ago

can you share the request being made to litellm for repro?

angelnu commented 1 day ago

I also hit the same issue using ollama_chat/llama3.1 running locally (Meta Llama 3.1 8B Instruct). Interestingly, ollama/llama3.1 did not have the same issue; I suspect that is because it does not track consumed tokens.

Adding str(function_arguments) solved the issue.
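
For clarity, a sketch of the change around utils.py line 1638 (the surrounding loop is my assumption from the traceback; only the += line appears there verbatim). json.dumps would likely approximate the real prompt tokens better than str() for dict arguments:

import json

# Defensive coercion before counting tokens; variable names other than
# `text` and `function_arguments` are assumptions, not litellm's code.
function_arguments = tool_call["function"]["arguments"]
if isinstance(function_arguments, dict):
    # autogen can pass tool arguments as a dict; serialize before counting
    function_arguments = json.dumps(function_arguments)
text += function_arguments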

linsida1 commented 20 hours ago

can you share the request being made to litellm for repro?

@krrishdholakia Yes, I made a simple demo to reproduce this error. Please check it, thank you.

litellm_ollama_chat_test.txt