microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International

Improve function_call experience in group chat #274

Closed LittleLittleCloud closed 1 year ago

LittleLittleCloud commented 1 year ago

There is a series of issues/bugs about function_call not found in group chat (#252, #152).

The reason is that function_call is implemented in two steps in autogen agents.

Usually the two steps are completed by different agents. This causes problems in group chat, where the group chat manager doesn't know which agent has the correct function_map to run a function_call. If the group chat "guesses" the executor agent incorrectly, a function_call not found error is thrown.
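
The two-step flow can be sketched roughly as follows (illustrative names only, not AutoGen's actual internals):

```python
import json

# Step 1: an LLM-backed agent proposes a call, producing a message like
# {"function_call": {"name": "say_hello", "arguments": '{"name": "bob"}'}}
def propose(llm_message):
    return llm_message.get("function_call")

# Step 2: a (possibly different) agent that holds the matching
# function_map executes the proposed call.
def execute(function_call, function_map):
    name = function_call["name"]
    if name not in function_map:
        # This is the "function_call not found" failure mode: the group
        # chat manager routed the call to an agent without the right map.
        raise KeyError(f"function {name} not found")
    args = json.loads(function_call.get("arguments", "{}"))
    return function_map[name](**args)
```

When the proposer and executor are different agents, the manager must pick the executor correctly, which is exactly what the options below address.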

There are three suggested options to fix it.

For the first option, we can adopt the code from @ilaffey2 and combine the admin and executor into the same agent to keep backward compatibility:

from autogen import GroupChat, ConversableAgent, UserProxyAgent
from dataclasses import dataclass

@dataclass
class ExecutorGroupchat(GroupChat):
    # the agent that executes function calls; not a field on the base GroupChat
    admin: ConversableAgent = None

    def select_speaker(
        self, last_speaker: ConversableAgent, selector: ConversableAgent
    ):
        """Select the next speaker."""

        try:
            # route any pending function_call to the admin/executor agent
            message = self.messages[-1]
            if "function_call" in message:
                return self.admin
        except Exception as e:
            print(e)

        selector.update_system_message(self.select_speaker_msg())
        final, name = selector.generate_oai_reply(
            self.messages
            + [
                {
                    "role": "system",
                    "content": f"Read the above conversation. Then select the next role from {self.agent_names} to play. Only return the role.",
                }
            ]
        )
        if not final:
            # i = self._random.randint(0, len(self._agent_names) - 1)  # randomly pick an id
            return self.next_agent(last_speaker)
        try:
            return self.agent_by_name(name)
        except ValueError:
            return self.next_agent(last_speaker)

For the second option, the group chat manager simply routes the function_call back to the agent that proposed it:

from autogen import GroupChat, ConversableAgent, UserProxyAgent
from dataclasses import dataclass

@dataclass
class ExecutorGroupchat(GroupChat):
    def select_speaker(
        self, last_speaker: ConversableAgent, selector: ConversableAgent
    ):
        """Select the next speaker."""

        try:
            message = self.messages[-1]
            if "function_call" in message:
                # the proposer is the previous speaker, so send the call back to it
                return last_speaker
        except Exception as e:
            print(e)
        # rest of the selection logic goes here, as in the first option

For the third option, the group chat manager sends the function_call to the next agent that has the corresponding key in its function_map:

from autogen import GroupChat, ConversableAgent, UserProxyAgent
from dataclasses import dataclass

@dataclass
class ExecutorGroupchat(GroupChat):
    def select_speaker(
        self, last_speaker: ConversableAgent, selector: ConversableAgent
    ):
        """Select the next speaker."""

        try:
            message = self.messages[-1]
            if "function_call" in message:
                # pick the first agent whose function_map contains the proposed function
                for agent in self.agents:
                    if message["function_call"]["name"] in agent.function_map:
                        return agent
        except Exception as e:
            print(e)
        # ...
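
As a standalone sketch of option three's lookup (stub `Agent` class for illustration, no AutoGen dependency):

```python
# Stub agent holding only a name and a function_map, to demonstrate the
# routing rule: a function_call goes to the first agent that can run it.
class Agent:
    def __init__(self, name, function_map=None):
        self.name = name
        self.function_map = function_map or {}

def route_function_call(message, agents, fallback):
    call = message.get("function_call")
    if call:
        for agent in agents:
            if call["name"] in agent.function_map:
                return agent
    # no pending call (or no capable agent): fall back to normal selection
    return fallback
```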

Per @sonichi's suggestion, we can use a flag to toggle among the three options to allow maximum flexibility.

sonichi commented 1 year ago

I have a proposal: add an argument in group chat. If the argument is set, when a function_call is suggested, we only pick the next agent among the agents that have the corresponding key in their function_map.

If no one else is willing to make this PR and I don't hear any objection by EOD today, I'll make a PR myself.

bonadio commented 1 year ago

Hi @LittleLittleCloud

I would like to add another option to your list of "suggested options": an agent that can run its own function_calls and just return the result. I created a sample one here: https://gist.github.com/bonadio/96435a1b6ccc32297aa8cc1db7cfc381
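
A rough, hypothetical sketch of the idea (this is not the code in the gist above, just an illustration of the shape):

```python
import json

# A "self-executing" agent proposes a call and runs it from its own
# function_map in the same turn, so the group chat never has to route
# the call to a separate executor.
class SelfExecutingAgent:
    def __init__(self, name, function_map):
        self.name = name
        self.function_map = function_map

    def reply(self, function_call):
        # proposer and executor are the same agent, so the
        # "function_call not found" routing problem cannot occur here
        args = json.loads(function_call.get("arguments", "{}"))
        return self.function_map[function_call["name"]](**args)
```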

LittleLittleCloud commented 1 year ago

I really like your idea of a self-executing agent. In fact, if all the group agents are self-executing, we won't even have this function-not-found issue.

Back to your proposal: the original group chat should already work for your case, so theoretically nothing needs to be changed in group chat to support your self-executing agent. We can verify that once the PR is out.

milioe commented 10 months ago

With pyautogen==0.2.2 I tried to run all three code options, but I got errors from each of them.

This is my current code:

from autogen import config_list_from_json, AssistantAgent, UserProxyAgent, GroupChat, GroupChatManager, Agent, ConversableAgent
import os
import random
from dataclasses import dataclass
from dotenv import load_dotenv
load_dotenv()


@dataclass
class ExecutorGroupchat(GroupChat):
    def select_speaker(
        self, last_speaker: ConversableAgent, selector: ConversableAgent
    ):
        """Select the next speaker."""

        try:
            message = self.messages[-1]
            if "function_call" in message:
                return self.admin
        except Exception as e:
            print(e)
            pass

        selector.update_system_message(self.select_speaker_msg())
        final, name = selector.generate_oai_reply(
            self.messages
            + [
                {
                    "role": "system",
                    "content": f"Read the above conversation. Then select the next role from {self.agent_names} to play. Only return the role.",
                }
            ]
        )
        if not final:
            # i = self._random.randint(0, len(self._agent_names) - 1)  # randomly pick an id
            return self.next_agent(last_speaker)
        try:
            return self.agent_by_name(name)
        except ValueError:
            return self.next_agent(last_speaker)

def say_hello(name):
    return f"Hi, {name}, how are you doing?"

def say_goodbye(name):
    return f"bye, {name}, have a good day"

def write_txt(text):
    with open("output.txt", "w") as f:
        f.write(text)
    return "done"

config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST.json")

llm_config = {
    "functions": [
        {
            "name": "say_hello",
            "description": "Use this function to say hello to someone",
            "parameters": {
                "type": "object",
                "properties": {
                    "name": {
                        "type": "string",
                        "description": "The name of the person to say hello to",
                    },
                },
                "required": ["name"],
            },
        },
        {
            "name": "say_goodbye",
            "description": "Use this function to say goodbye to someone",
            "parameters": {
                "type": "object",
                "properties": {
                    "name": {
                        "type": "string",
                        "description": "The name of the person to say goodbye to",
                    },
                },
                "required": ["name"],
            },
        },
        {
            "name": "write_txt",
            "description": "Use this function to write content to a file",
            "parameters": {
                "type": "object",
                "properties": {
                    "text": {
                        "type": "string",
                        "description": "The text to write",
                    },
                },
                "required": ["text"],
            },
        },
    ],
    "config_list": config_list,
    "seed": 45,
    "request_timeout": 120
}

user_proxy = UserProxyAgent(
    name="user_proxy",
    system_message="A human that will provide the necessary information to the group chat manager. Execute suggested function calls.",
    function_map={
        "say_hello": say_hello,
        "say_goodbye": say_goodbye,
        "write_txt": write_txt,
    },
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "fileread"})

assistant = AssistantAgent(
    name="assistant",
    system_message="""You are an assistant that proposes the execution of functions to the user proxy""",
    llm_config=llm_config
)

architect = AssistantAgent(
    name="azure_architect",
    system_message="""You are an architect that creates a plan in order for the assistant to execute the functions and complete the task""",
    llm_config={'config_list': config_list, 'seed': 45, 'request_timeout': 120},
)

groupchat = ExecutorGroupchat(agents=[user_proxy, assistant, architect], messages=[],max_round=20)

manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config, system_message="Choose one agent to play the role of the user proxy")

user_proxy.initiate_chat(
    manager,
    message="""say hello to thibault"""
    )

I got this error with all three options:

Traceback (most recent call last):
  File "/Users/emiliosandoval/Documents/gen/Testing/group_w_funct.py", line 137, in <module>
    user_proxy.initiate_chat(
  File "/Users/emiliosandoval/opt/anaconda3/envs/gen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 556, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/Users/emiliosandoval/opt/anaconda3/envs/gen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 354, in send
    recipient.receive(message, self, request_reply, silent)
  File "/Users/emiliosandoval/opt/anaconda3/envs/gen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 487, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/Users/emiliosandoval/opt/anaconda3/envs/gen/lib/python3.10/site-packages/autogen/agentchat/conversable_agent.py", line 962, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/Users/emiliosandoval/opt/anaconda3/envs/gen/lib/python3.10/site-packages/autogen/agentchat/groupchat.py", line 338, in run_chat
    speaker = groupchat.select_speaker(speaker, self)
  File "/Users/emiliosandoval/Documents/gen/Testing/group_w_funct.py", line 24, in select_speaker
    selector.update_system_message(self.select_speaker_msg())
TypeError: GroupChat.select_speaker_msg() missing 1 required positional argument: 'agents'

I noticed that if I pass the llm_config argument to the manager (manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config, system_message="Choose one agent to play the role of the user proxy")), this error appears: ValueError: GroupChatManager is not allowed to make function/tool calls. Please remove the 'functions' or 'tools' config in 'llm_config' you passed in.

Is it because of the versions, the code, or one of the prompts? I'd appreciate your help. @sonichi

sonichi commented 10 months ago

The llm_config passed to GroupChatManager can't contain functions or tools in pyautogen v0.2.2.
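
One workaround, sketched here under the assumption that llm_config is a plain dict as in the code above, is to hand the manager a copy with those keys stripped while the assistant keeps the full config:

```python
# Assumed llm_config shape, matching the earlier snippet (placeholder values).
llm_config = {
    "config_list": [{"model": "gpt-4"}],
    "functions": [{"name": "say_hello"}],
}

# GroupChatManager gets a copy without "functions"/"tools";
# AssistantAgent can still receive the full llm_config.
manager_llm_config = {
    k: v for k, v in llm_config.items() if k not in ("functions", "tools")
}
```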

HARISHKUMAR1112001 commented 7 months ago

Is there any way to include the llm_config with functions or tools with GroupChatManager?

uthpala1000 commented 7 months ago

@HARISHKUMAR1112001 you mean like this?

AssistantAgent(
    name="ExecutorAgent",
    system_message="If you face any issues, provide the exact issue and any recommendations.",
    llm_config={"config_list": config_list, "functions": [check_weather]},
    code_execution_config={"work_dir": "coding", "use_docker": False},
)