microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International
33.88k stars 4.89k forks

[Issue]: chat_messages usage in 2 agents with tools #2968

Open kk2491 opened 4 months ago

kk2491 commented 4 months ago

Describe the issue

First of all, thank you all for making this awesome tool.

I have recently started using autogen, and the use case I am trying to achieve is a sequence of related queries, one after the other, in separate chats. I am using only 2 agents (one user proxy and one assistant).

For example, the first question would be: "create a resource in x". The assistant and user proxy work together, use the tools, and create the resource. The second, follow-up question would be: "update the resource you just created". How can I provide the previous chat history so that the agents know the context?

As per the documentation, I see that chat_messages can be used to provide the previous conversation. However, I am not able to figure out exactly how to populate it; could you please help me here?

I am using the Python script below:

import autogen  
from autogen.coding import LocalCommandLineCodeExecutor
from external_tools import ( 
    create_abc_resource,
    update_abc_resource,
)

def fetch_chat_history(chat_history):
    # Placeholder: returns the history unchanged for now.
    # formatted_chat_history = []
    # for each_chat in chat_history:
    return chat_history

def run_autogen():

    config_list = [
        {
            "model" : "llama3-70b-8192",
            "api_type" : "open_ai",
            "base_url" : "API_URL",
            "api_key" : "XYZ"
        }
    ]

    llm_config = {
        "timeout" : 600,
        "seed" : 42, 
        "config_list" : config_list,
        "temperature" : 0
    }

    chat_history_list = []

    while True:

        user_proxy = autogen.ConversableAgent(
            name="User",
            llm_config=False,
            is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
            human_input_mode="NEVER",
        )

        assistant = autogen.ConversableAgent(
            name="Assistant",
            system_message="You are a helpful assistant. "
            "You can help with the simple calculations. "
            "Return 'TERMINATE' when the task is done. ",
            llm_config=llm_config,
            chat_messages={user_proxy : chat_history_list}
        )

        assistant.register_for_llm(name="create_abc_resource", description="Create a resource in ABC")(create_abc_resource)
        assistant.register_for_llm(name="update_abc_resource", description="Update a resource in ABC given the id")(update_abc_resource)        

        user_proxy.register_for_execution(name="create_abc_resource")(create_abc_resource)
        user_proxy.register_for_execution(name="update_abc_resource")(update_abc_resource)

        question = input("Enter Query : ")
        chat_result = user_proxy.initiate_chat(
            assistant,
            message=question,
        )
        print("=== chat result ===")   
        print(chat_result) 

        current_chat_history = fetch_chat_history(chat_result.chat_history)
        chat_history_list = chat_history_list + current_chat_history

    return

if __name__ == "__main__":
    run_autogen()

After the first query, the chat_result output is as given below:

ChatResult(
    chat_id=None, 
    chat_history=[{'content': 'create a resource in x', 'role': 'assistant'}, {'tool_calls': [{'id': 'call_fyzx', 'function': {'arguments': '{"input":{"name":"name"}}', 'name': 'create_abc_resource'}, 'type': 'function'}], 'content': None, 'role': 'assistant'}, {'content': '{"id": "66729022a205c5a5975e76f8", "name": "name",    "createdTimestamp": "2024-06-19T08:00:34.862Z", "lastUpdatedTimestamp": "2024-06-19T08:00:34.862Z"}', 'tool_responses': [{'tool_call_id': 'call_fyzx', 'role': 'tool', 'content': '{"id": "66729022a205c5a5975e76f8", "name": "name", "createdTimestamp": "2024-06-19T08:00:34.862Z", "lastUpdatedTimestamp": "2024-06-19T08:00:34.862Z"}'}], 'role': 'tool'}, {'content': 'The resource has been created in x. The ID is 66729022a205c5a5975e76f8.', 'role': 'user'}, {'content': '', 'role': 'assistant'}, {'content': 'TERMINATE', 'role': 'user'}], 
    summary='', 
    cost={'usage_including_cached_inference': {'total_cost': 0, 'llama3-70b-8192': {'cost': 0, 'prompt_tokens': 3767, 'completion_tokens': 122, 'total_tokens': 3889}}, 'usage_excluding_cached_inference': {'total_cost': 0, 'llama3-70b-8192': {'cost': 0, 'prompt_tokens': 3767, 'completion_tokens': 122, 'total_tokens': 3889}}}, 
    human_input=[]
)

How can I populate the chat_messages field in the agents, so that the agents know the previous context when I ask a new question?

Thank you,
KK

Steps to reproduce

No response

Screenshots and logs

No response

Additional Information

No response

Hk669 commented 4 months ago

Hi @kk2491, thank you for your kind words. I think this tutorial should help you with resuming a group chat: https://microsoft.github.io/autogen/docs/topics/groupchat/resuming_groupchat/

kk2491 commented 4 months ago

@Hk669 Thank you for providing the relevant material. I have gone through the details. In my case, I am looking at a chat between only 2 agents, and I want to know how I can effectively use the chat_messages field in ConversableAgent to add the previous chat history.

Thank you,
KK

scruffynerf commented 4 months ago

I believe this is the key for you:

chat_result = user_proxy.initiate_chat(
    assistant,
    message=question,
)

In a new chat, you normally pass in a message, which is a string, and initiate_chat does the rest. If you look at that function, you'll see summary and carryover/context discussed, but not how to add chat_history. You can pass a list of chat messages with your question last, but look at your example: the last two items are

{'content': '', 'role': 'assistant'}, 
{'content': 'TERMINATE', 'role': 'user'}

You probably don't want to pass those in.
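One way to drop those trailing items before reusing the history is a small helper like this (a sketch, not part of AutoGen; it assumes the message dicts have the same shape as in the ChatResult dump above, and the helper name `trim_chat_history` is my own):

```python
def trim_chat_history(chat_history):
    """Drop trailing messages whose content is empty or 'TERMINATE'.

    Leaves tool-call messages alone, since those carry content=None
    but are still meaningful.
    """
    trimmed = list(chat_history)
    while (
        trimmed
        and "tool_calls" not in trimmed[-1]
        and (trimmed[-1].get("content") or "").strip() in ("", "TERMINATE")
    ):
        trimmed.pop()
    return trimmed

history = [
    {"content": "create a resource in x", "role": "assistant"},
    {"content": "The resource has been created in x. The ID is 66729022a205c5a5975e76f8.", "role": "user"},
    {"content": "", "role": "assistant"},
    {"content": "TERMINATE", "role": "user"},
]
print(trim_chat_history(history))  # last two entries removed
```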

BUT /agentchat/chat.py does contain a function, initiate_chats(), which includes args for:

- `"carryover"` - Can be used to specify the carryover information to be passed to this chat. If provided, it will be combined with the "message" content when generating the initial chat message in `generate_init_message`.
- `"finished_chat_indexes_to_exclude_from_carryover"` - Can be used to specify a list of indexes into the finished_chats list whose summaries should be excluded from carryover. If not provided, or an empty list, the summaries from all the finished chats will be taken.

If you are doing summaries, this is helpful: fewer tokens to send and process. It can also be used to "clean up" the chat_history (plus the current message) and give you a single "message bundle" to send when initiating a new chat, carrying the history of previous discussions.
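The carryover mechanics can be sketched roughly like this. This is a simplified illustration of how a carryover string (or list of earlier summaries) gets appended to the new message, not the actual AutoGen code; the exact wording/format the library uses when building the initial message may differ, and `build_init_message` is a hypothetical name:

```python
def build_init_message(message: str, carryover) -> str:
    """Append carryover context to the user's message, roughly as
    initiate_chat does when generating the initial chat message."""
    if not carryover:
        return message
    if isinstance(carryover, str):
        context = carryover
    else:
        # A list of summaries from earlier finished chats.
        context = "\n".join(str(item) for item in carryover)
    return message + "\nContext: \n" + context

msg = build_init_message(
    "update the resource you just created",
    ["Created resource 66729022a205c5a5975e76f8 in x."],
)
print(msg)
```

So rather than replaying the full message-by-message history, you pass the previous chat's summary (or summaries) as carryover, and the new chat starts with that context baked into its first message.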