microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Issue]: how to make an agent use a local tool? #1213

Closed: timegoby closed this issue 3 weeks ago

timegoby commented 8 months ago

Describe the issue

For example, I have a server that provides text_to_speech. Now I want the agent to use this local tool when I ask it to "read out the answer".

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response

rickyloynd-microsoft commented 8 months ago

That could be done through function calling: https://github.com/microsoft/autogen/blob/main/notebook/agentchat_function_call_currency_calculator.ipynb
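
For instance, a minimal sketch along those lines (assuming a hypothetical text_to_speech(text) helper that calls your local server; the names here are illustrative, not part of AutoGen):

from typing import Annotated

import autogen

llm_config = {"config_list": autogen.config_list_from_json("OAI_CONFIG_LIST")}

assistant = autogen.AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
    max_consecutive_auto_reply=1,  # keep the sketch bounded
)

# The assistant sees the tool's schema; the user proxy actually runs it.
@user_proxy.register_for_execution()
@assistant.register_for_llm(description="Read text aloud via the local text-to-speech server")
def read_out(text: Annotated[str, "The text to read aloud"]) -> str:
    text_to_speech(text)  # hypothetical call to your local TTS server
    return "done"

user_proxy.initiate_chat(assistant, message="Read out the answer to 2 + 2.")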

dangvansam commented 8 months ago

@rickyloynd-microsoft Nice example, but I'm getting this error: An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response.

Also, is there a way for me to pass arguments directly to my function (my_generate_answer_local_function) without going through the OpenAI API, and to get my function's output after calling initiate_chat? This is my implementation:

import json
from typing import Annotated, Dict

import autogen

config = autogen.config_list_from_dotenv()
llm_config = {"config_list": config}

master_assistant = autogen.AssistantAgent(
    name="master_assistant",
    llm_config=llm_config,
    system_message=MASTER_ASSISTANT_PROMPT,
    human_input_mode="NEVER",
    code_execution_config=False,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    llm_config=llm_config,
    system_message=USER_PROXY_PROMPT,
    human_input_mode="NEVER",
    code_execution_config=False,
    max_consecutive_auto_reply=None,
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
)

# The assistant proposes the tool call; the user proxy executes the local function.
@user_proxy.register_for_execution()
@master_assistant.register_for_llm(description="Answer any user question")
def generate_answer(
    param_1: Annotated[str, "param 1 description"],
    param_2: Annotated[str, "param 2 description"],
    param_3: Annotated[Dict[str, str], "param 3 description"] = {},
) -> Dict[str, str]:
    return my_generate_answer_local_function(param_1, param_2, param_3)

user_proxy.initiate_chat(
    master_assistant,
    message=json.dumps({"param1": "1", "param2": "2", "param3": {"key": "value"}}),
)

rickyloynd-microsoft commented 8 months ago

@yenif Are you a good reference for questions about tools?

namanbarkiya commented 7 months ago

Is there any way to import an array of tools from another file and use that decorator on each of them somehow?

ekzhu commented 7 months ago

Decorators are simply functions themselves, so you can apply them to your functions directly:

for function, description in functions:
    assistant.register_for_llm(description=description)(function)
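
Extending that, a minimal sketch (assuming a hypothetical my_tools.py module that exports functions as a list of (function, description) pairs; the executing agent needs each tool registered as well):

from my_tools import functions  # hypothetical: list of (function, description) pairs

for function, description in functions:
    # Expose the schema to the LLM-facing agent and the implementation to the executor.
    assistant.register_for_llm(description=description)(function)
    user_proxy.register_for_execution()(function)
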
namanbarkiya commented 7 months ago

Thanks a lot. This is how I have done it:

def function_name(param1: Annotated[str, "This is param 1"],
                  param2: Annotated[str, "This is param 2"], ...) -> str:
    return "something is returned"

user_proxy.register_for_execution()(function_name)
chatbot.register_for_llm(name="function_name",
                         description="This is some function")(function_name)

I can simply add a for loop there and define the functions outside the file.
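
For example, a minimal sketch of that loop (assuming a hypothetical my_functions.py module that exports TOOLS as a list of (name, description, function) tuples):

from my_functions import TOOLS  # hypothetical: list of (name, description, function) tuples

for name, description, function in TOOLS:
    user_proxy.register_for_execution(name=name)(function)
    chatbot.register_for_llm(name=name, description=description)(function)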