I have a chatbot that calls a function with no arguments. Calling the function with OpenAI models works just fine. When using Gemini 1.5 Pro, the tool call is made and the results are written to the message history, but upon feeding the messages with the tool call results back to the LLM, I get this error:
Exception: Unable to convert openai tool calls={'role': 'assistant', 'content': '', 'tool_calls': [{'id': 'call_2c384bc6-de46-4f29-8adc-60dd5805d305', 'function': {'name': 'Get-FAQ', 'arguments': '{}'}, 'type': 'function'}]} to gemini tool calls. Received error=function_call missing. Received tool call with 'type': 'function'. No function call in argument - {'id': 'call_2c384bc6-de46-4f29-8adc-60dd5805d305', 'function': {'name': 'Get-FAQ', 'arguments': '{}'}, 'type': 'function'}
Looks like the problem is in litellm/llms/prompt_templates/factory.py:
def _gemini_tool_call_invoke_helper(
    function_call_params: ChatCompletionToolCallFunctionChunk,
) -> Optional[litellm.types.llms.vertex_ai.FunctionCall]:
    name = function_call_params.get("name", "") or ""
    arguments = function_call_params.get("arguments", "")
    arguments_dict = json.loads(arguments)
    function_call: Optional[litellm.types.llms.vertex_ai.FunctionCall] = None
    for k, v in arguments_dict.items():
        inferred_protocol_value = infer_protocol_value(value=v)
        _field = litellm.types.llms.vertex_ai.Field(
            key=k, value={inferred_protocol_value: v}
        )
        _fields = litellm.types.llms.vertex_ai.FunctionCallArgs(fields=_field)
        function_call = litellm.types.llms.vertex_ai.FunctionCall(
            name=name,
            args=_fields,
        )
    return function_call
At line 946, the function_call variable is initialized to None. Since the function takes no arguments, arguments_dict is empty, so the loop body never executes and the function ultimately returns None, as if there were no function call at all.
Relevant log output
Traceback (most recent call last):
  File "venv3/lib/python3.11/site-packages/litellm/main.py", line 481, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/vertex_and_google_ai_studio_gemini.py", line 1229, in async_completion
    request_body = await async_transform_request_body(**data)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/transformation.py", line 407, in async_transform_request_body
    return _transform_request_body(
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/transformation.py", line 326, in _transform_request_body
    raise e
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/transformation.py", line 297, in _transform_request_body
    content = _gemini_convert_messages_with_history(messages=messages)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/transformation.py", line 253, in _gemini_convert_messages_with_history
    raise e
  File "venv3/lib/python3.11/site-packages/litellm/llms/vertex_ai_and_google_ai_studio/gemini/transformation.py", line 216, in _gemini_convert_messages_with_history
    convert_to_gemini_tool_call_invoke(assistant_msg)
  File "venv3/lib/python3.11/site-packages/litellm/llms/prompt_templates/factory.py", line 1072, in convert_to_gemini_tool_call_invoke
    raise Exception(
Exception: Unable to convert openai tool calls={'role': 'assistant', 'content': '', 'tool_calls': [{'id': 'call_2c384bc6-de46-4f29-8adc-60dd5805d305', 'function': {'name': 'Get-Mindspark-FAQ', 'arguments': '{}'}, 'type': 'function'}]} to gemini tool calls. Received error=function_call missing. Received tool call with 'type': 'function'. No function call in argument - {'id': 'call_2c384bc6-de46-4f29-8adc-60dd5805d305', 'function': {'name': 'Get-FAQ', 'arguments': '{}'}, 'type': 'function'}