run-llama / multi-agent-concierge

An example of multi-agent orchestration with llama-index

Unable to Implement for Vertex LLM #13

Open stfines-clgx opened 8 hours ago

stfines-clgx commented 8 hours ago

So I attempted this with Vertex AI using Gemini as the LLM. I verified that it is a function-calling LLM, but it crashes when it hits this line in the `orchestrator()` method:

```python
tool_calls = llm.get_tool_calls_from_response(
    response, error_on_no_tool_call=False
)
```

It seems that Vertex assumes the tool call will be a protocol buffer of some sort:

```
AttributeError: 'FunctionCall' object has no attribute '_pb'
```
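For context, here is a minimal sketch of the failing path. It assumes the `llama-index-llms-vertex` integration and the standard `FunctionCallingLLM` interface; the tool, model name, and prompt are purely illustrative, not the concierge's actual code:

```python
from llama_index.core.tools import FunctionTool
from llama_index.llms.vertex import Vertex


def transfer_to_agent(agent_name: str) -> str:
    """Illustrative stand-in for the concierge's TransferToAgent tool."""
    return f"transfer to {agent_name}"


tool = FunctionTool.from_defaults(fn=transfer_to_agent)

# Assumes Vertex AI credentials/project are already configured in the environment.
llm = Vertex(model="gemini-pro")

response = llm.chat_with_tools([tool], user_msg="Please transfer me to billing.")

# This is the call that raises AttributeError: 'FunctionCall' object has no
# attribute '_pb' with the Vertex integration.
tool_calls = llm.get_tool_calls_from_response(
    response, error_on_no_tool_call=False
)
```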

(The offending code in the Vertex source:

```python
for tool_call in tool_calls:
    response_dict = MessageToDict(tool_call._pb)
    if "args" not in response_dict or "name" not in response_dict:
        raise ValueError("Invalid tool call.")
    argument_dict = response_dict["args"]
    [...]
```

)
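For what it's worth, a defensive conversion along these lines might sidestep the attribute error. This is only a sketch, not the library's actual code, and it assumes that when no `_pb` wrapper is present the object is already a raw protobuf message that `MessageToDict` can accept:

```python
from google.protobuf.json_format import MessageToDict


def function_call_to_dict(tool_call):
    # proto-plus wrappers keep the underlying protobuf message in ._pb;
    # plain protobuf messages can be passed to MessageToDict directly.
    pb = getattr(tool_call, "_pb", tool_call)
    return MessageToDict(pb)
```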

What is being accomplished by turning the TransferToAgent message into a Tool? And can it be accomplished in some other way? I've had plenty of luck invoking non-function tools with Gemini; it is only this function-calling path that seems to be giving me trouble.
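As a sketch of what "some other way" might look like, the transfer decision could be requested as structured output instead of a tool call. This assumes the orchestrator prompt can be changed; the agent names and JSON shape here are made up:

```python
import json

from llama_index.core.llms import ChatMessage

TRANSFER_PROMPT = (
    "Decide which agent should handle the user. "
    'Reply with JSON only, e.g. {"transfer_to": "auth_agent"} '
    'or {"transfer_to": null} if no transfer is needed.'
)


def pick_agent(llm, user_msg: str) -> str | None:
    # Ask for a JSON verdict rather than exposing a TransferToAgent tool.
    response = llm.chat([
        ChatMessage(role="system", content=TRANSFER_PROMPT),
        ChatMessage(role="user", content=user_msg),
    ])
    try:
        return json.loads(response.message.content).get("transfer_to")
    except (json.JSONDecodeError, AttributeError, TypeError):
        # Model did not return parseable JSON; treat as "no transfer".
        return None
```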