microsoft / autogen

A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
https://microsoft.github.io/autogen/

[Bug]: Retrieve Agents not working with function calls #1469

Open lucascampodonico opened 5 months ago

lucascampodonico commented 5 months ago

Describe the bug

Hello, I get an error when I try to use a group chat of agents with RAG and function calls. Without RAG it works perfectly, but with RAG it throws an error related to the context.

I modified the code in autogen/agentchat/contrib/retrieve_user_proxy_agent.py to make it work. I don't know whether that is the right fix, but it is working well for me at the moment.

Steps to reproduce

def generate_llm_config(tool):
    # Define the function schema based on the tool's args_schema
    if tool.name == "appointment_scheduler":
        function_schema = {
            "name": tool.name.lower().replace(" ", "_"),
            "description": tool.description,
            "parameters": {
                "type": "object",
                "properties": {},
                "required": ["date", "hour"],
            },
        }
    else:
        function_schema = {
            "name": tool.name.lower().replace(" ", "_"),
            "description": tool.description,
            "parameters": {
                "type": "object",
                "properties": {},
                "required": [],
            },
        }

    if tool.args is not None:
        function_schema["parameters"]["properties"] = tool.args

    return function_schema
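
For illustration (names invented here), a non-appointment tool with a single "query" argument would come out of the helper above as an old-style function spec shaped like this:

# Hypothetical example only: the shape of the spec returned by
# generate_llm_config for a tool named "google search" with one "query" arg.
example_spec = {
    "name": "google_search",
    "description": "Search Google and return the top results.",
    "parameters": {
        "type": "object",
        "properties": {"query": {"title": "Query", "type": "string"}},
        "required": [],
    },
}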

llm_config_agent = {
    "functions": [
        generate_llm_config(custom_tool),
        generate_llm_config(google_search_tool),
        generate_llm_config(google_places_tool),
        generate_llm_config(appointment_scheduler_tool),
    ],
    "config_list": _config_list,
    "timeout": 60,
    "cache_seed": 42,
}

appointment_scheduler = autogen.AssistantAgent(
    name="appointment_scheduler",
    is_termination_msg=termination_msg,
    system_message="You are a helpful assistant for schedule appointment in ¡spanish language!, The date format to send to the function is YYYY-DD-MM hh:mm:ss. If you not have the date and hour in context, you ask for it with 'TERMINATE' in the end of answer!!. You answer that you do not have data to answer that question. Reply `TERMINATE` in the end when everything is done.",
    llm_config=llm_config_agent,
)

assistant = RetrieveAssistantAgent(
    name="assistant",
    is_termination_msg=termination_msg,
    system_message="You are a useful assistant to answer any questions in ¡spanish language! that are not related to the other agents. You have access to internet with google_search_tool for answer completly. If you not have context and you need context. You answer that you do not have data to answer that question. Reply `TERMINATE` in the end when everything is done.",
    llm_config=llm_config_agent,
)

property_informer = autogen.AssistantAgent(
    name="property_informer",
    is_termination_msg=termination_msg,
    system_message="""You are a helpful assistant for properties information in ¡spanish language!\n 
    You only answer for the information for each ROLE
    If role is agent: full access to answer.
    If role not is agent: You not answer about commission of property.
    You execute the functions to resolve answers.\n
    If you not have context and you need context.\n
    You answer that you do not have data to answer that question.\n
    Reply `TERMINATE` in the end when everything is done.""",
    llm_config=llm_config_agent,
)

ragproxyagent = RetrieveUserProxyAgent(
    name="ragproxyagent",
    is_termination_msg=termination_msg,
    human_input_mode="NEVER",
    max_consecutive_auto_reply=5,
    retrieve_config={
        "task": "code",
        "docs_path": docs_path,
        "chunk_token_size": 2000,
        "model": _config_list[0]["model"],
        "client": chromadb.PersistentClient(path="/tmp/chromadb"),
        "embedding_model": "all-mpnet-base-v2",
        "customized_prompt": PROMPT_CODE,
        "get_or_create": True,
        "collection_name": "agents_rag",
    },
    code_execution_config={"work_dir": "coding"},
)

# Register the tool and start the conversation
ragproxyagent.register_function(
    function_map={
        google_search_tool.name: google_search_tool._run,
        custom_tool.name: custom_tool._run,
        google_places_tool.name: google_places_tool._run,
        appointment_scheduler_tool.name: appointment_scheduler_tool._run,
    }
)

groupchat = autogen.GroupChat(
    agents=[ragproxyagent, appointment_scheduler, assistant, property_informer],
    messages=[],
    max_round=12,
    speaker_selection_method="auto",
    allow_repeat_speaker=False,
)

manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

ragproxyagent.initiate_chat(manager, problem=problem, n_results=n_results)

Expected Behavior

Function calls suggested by an agent should be executed correctly when the group chat uses the RetrieveUserProxyAgent.

Screenshots and logs

Before

(screenshot: the error raised before the modification)

After

(screenshot: the working output after the modification)

Additional Information

No response

ekzhu commented 5 months ago

Thanks for the issue. This could be a PR. @thinkall for awareness

davorrunje commented 5 months ago

@lucascampodonico which version of autogen are you using? Which model? If you are using OpenAI on Azure, which api_version? I also see you are using the deprecated style of llm_config with functions instead of tools; that might cause a clash.
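
For reference, a minimal sketch of the newer tool-call style for the config above, reusing the generate_llm_config helper and _config_list from the report (an untested sketch, not a verified drop-in):

# Sketch: each old "functions" entry is wrapped as
# {"type": "function", "function": <spec>} under a "tools" key instead.
llm_config_agent = {
    "tools": [
        {"type": "function", "function": generate_llm_config(custom_tool)},
        {"type": "function", "function": generate_llm_config(google_search_tool)},
        {"type": "function", "function": generate_llm_config(google_places_tool)},
        {"type": "function", "function": generate_llm_config(appointment_scheduler_tool)},
    ],
    "config_list": _config_list,
    "timeout": 60,
    "cache_seed": 42,
}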

lucascampodonico commented 5 months ago

@davorrunje

I am using pyautogen==0.2.9

config_list = [ { "model": "gpt-4", "api_key": "sk-.........", }, ]

What is the new style of llm_config for using tools?

davorrunje commented 5 months ago

@lucascampodonico you can use the decorators @register_for_llm and @register_for_execution to automatically generate function specifications and add them to your llm_config. OpenAI recently changed their API so that functions are now declared wrapped in a tools JSON. You are using the old style without tools, but if you use the decorators, they will create the correct version of the JSON.
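
A minimal sketch of the decorator approach, assuming the appointment_scheduler assistant and ragproxyagent from the report above; the function name, annotations, and body are illustrative, and the old "functions" list would be dropped from llm_config so the two styles do not clash:

from typing_extensions import Annotated

# The LLM-facing agent receives the generated tool schema (register_for_llm);
# the executing agent receives the implementation (register_for_execution).
@ragproxyagent.register_for_execution()
@appointment_scheduler.register_for_llm(description="Schedule an appointment for a given date and hour.")
def schedule_appointment(
    date: Annotated[str, "Date in YYYY-DD-MM format"],
    hour: Annotated[str, "Hour in hh:mm:ss format"],
) -> str:
    # Illustrative body only; a real tool would call the scheduling backend.
    return f"Appointment requested for {date} at {hour}."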

ekzhu commented 5 months ago

See example https://github.com/microsoft/autogen/blob/main/notebook/agentchat_function_call_currency_calculator.ipynb

sonichi commented 4 months ago

@thinkall please take note of this issue and make sure you include @lucascampodonico in your RAG refactor issue/PR.

thinkall commented 4 months ago

Hi @lucascampodonico, have you tried the new APIs that @ekzhu and @davorrunje suggested? In #1661 you can also find an updated example of using RAG with functions.