run-llama / multi-agent-concierge

An example of multi-agent orchestration with llama-index
MIT License
340 stars 55 forks

Unable to Implement for Vertex LLM #13

Open stfines-clgx opened 1 month ago

stfines-clgx commented 1 month ago

So I attempted this with Vertex AI using Gemini as the LLM. I verified that it is a function-calling LLM, but it crashes when it hits this line in the `orchestrator()` method:

```python
tool_calls = llm.get_tool_calls_from_response(
    response, error_on_no_tool_call=False
)
```

It seems that Vertex assumes the tool call will be a protocol buffer of some sort:

```
AttributeError: 'FunctionCall' object has no attribute '_pb'
```

The offending code in the Vertex source:

```python
for tool_call in tool_calls:
    response_dict = MessageToDict(tool_call._pb)
    if "args" not in response_dict or "name" not in response_dict:
        raise ValueError("Invalid tool call.")
    argument_dict = response_dict["args"]
    # [...]
```

What is being accomplished by turning the TransferToAgent message into a Tool? And can it be accomplished some other way? I've had plenty of luck invoking non-function tools with Gemini; it is only this function-calling one that seems to be giving me trouble.
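For context on the handoff pattern being asked about: the transfer action is typically exposed to the LLM as just another callable tool, so the model can request an agent switch through the same function-calling channel it uses for everything else, and the orchestrator intercepts that call instead of executing it. A minimal, framework-free sketch of that routing idea (the names `transfer_to_agent` and `route` are hypothetical illustrations, not code from this repo):

```python
def transfer_to_agent(agent_name: str) -> str:
    """Hypothetical handoff 'tool': it is never meaningfully executed;
    the orchestrator watches for a call to it and switches agents."""
    return agent_name


def route(tool_calls: list[dict], active_agent: str) -> str:
    """Return the agent that should handle the next turn.

    `tool_calls` is assumed to be the already-parsed output of something
    like get_tool_calls_from_response: dicts with 'name' and 'args' keys.
    """
    for call in tool_calls:
        if call["name"] == "transfer_to_agent":
            # Intercept the handoff instead of running it as a tool.
            return call["args"]["agent_name"]
    return active_agent
```

For example, `route([{"name": "transfer_to_agent", "args": {"agent_name": "auth_agent"}}], "concierge")` returns `"auth_agent"`, while a turn with only ordinary tool calls leaves the active agent unchanged.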

logan-markewich commented 3 weeks ago

Seems like an issue with Vertex -- possibly the generative-ai package updated and broke some underlying code. It would need a PR to fix.

This function is just trying to parse the tool calls out of the LLM response.
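Until an upstream fix lands, one defensive option is to normalize the tool-call object before handing it to `MessageToDict`, tolerating both raw-protobuf wrappers (which expose `._pb`) and objects that don't. This is only a sketch, under the assumption that the installed `FunctionCall` matches one of the shapes below; it is not the library's official API:

```python
try:
    # protobuf is only needed for the `._pb` branch
    from google.protobuf.json_format import MessageToDict
except ImportError:
    MessageToDict = None


def tool_call_to_dict(tool_call):
    """Best-effort conversion of a tool-call object to a plain dict.

    Assumption: the object either wraps a raw protobuf at `._pb`, or
    exposes a `to_dict` (either a plain instance method, or proto-plus's
    classmethod form, Message.to_dict(instance)).
    """
    if MessageToDict is not None and hasattr(tool_call, "_pb"):
        return MessageToDict(tool_call._pb)
    if hasattr(tool_call, "to_dict"):
        # Calling through the class works for both a plain instance
        # method and proto-plus's classmethod signature.
        return type(tool_call).to_dict(tool_call)
    raise TypeError(f"unsupported tool call type: {type(tool_call)!r}")
```

The downstream validation from the original snippet (checking that `"name"` and `"args"` are present in the resulting dict) would still apply after this conversion.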

noabenefraim commented 1 week ago

Same issue here. Is there a workaround?