M340i opened this issue 1 month ago (status: Open)
@M340i it would be helpful to include the definitions of agent1, agent2, and agent3.
Since you did not provide a specific agent definition, I used empty agents for testing, and the results I obtained look similar to yours. Perhaps by providing more detailed definitions of agent1, agent2, and agent3, you can achieve the results you are looking for. Here is my simple test code:
```python
import autogen

# config_list is assumed to be loaded from your OAI_CONFIG_LIST elsewhere.
llm_config = {"config_list": config_list, "cache_seed": 42}
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "solution", "use_docker": False},
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    description="I stands for user.",
)
task = """draw a network topology with a central switch connected to a router and three other switch in txt"""
agent1 = autogen.AssistantAgent(name="agent1", description="I stands for agent1.", llm_config=llm_config)
agent2 = autogen.AssistantAgent(name="agent2", description="I stands for agent2.", llm_config=llm_config)
agent3 = autogen.AssistantAgent(name="agent3", description="I stands for agent3.", llm_config=llm_config)
groupchat = autogen.GroupChat(agents=[agent1, agent2, agent3, user_proxy], messages=[], max_round=10)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)
user_proxy.send(task, recipient=manager, request_reply=True)
```
I used ollama + llama3.1 with your suggested code. This is the code:
```python
from autogen import config_list_from_json, UserProxyAgent, AssistantAgent, GroupChat, GroupChatManager

config_list_llama = config_list_from_json(
    "../OAI_CONFIG_LIST",
    filter_dict={
        "model": ["llama3.1"]
    },
)
llm_config = {"config_list": config_list_llama, "cache_seed": 42, "temperature": 0, "price": [0.00001, 0.000001]}
user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "autocode", "use_docker": False},
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    description="I stands for user.",
)
task = """draw a network topology with a central switch connected to a router and three other switch in txt"""
agent1 = AssistantAgent(name="agent1", description="I stands for agent1.", llm_config=llm_config)
agent2 = AssistantAgent(name="agent2", description="I stands for agent2.", llm_config=llm_config)
agent3 = AssistantAgent(name="agent3", description="I stands for agent3.", llm_config=llm_config)
groupchat = GroupChat(agents=[agent1, agent2, agent3, user_proxy], messages=[], max_round=10)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)
user_proxy.send(task, recipient=manager, request_reply=True)
```
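For reference, `config_list_from_json` above reads `../OAI_CONFIG_LIST`; a minimal entry for a local ollama endpoint might look like the following. The `base_url` and `api_key` values are assumptions about a default ollama install (ollama serves an OpenAI-compatible API on port 11434 and ignores the API key, but the client requires one), not details taken from the original post:

```python
# Hypothetical contents of OAI_CONFIG_LIST for a local ollama server.
config_list_llama = [
    {
        "model": "llama3.1",
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",  # ignored by ollama, but must be non-empty
    }
]
```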
This is the result:
/Users/mrz/Documents/Python/miniconda3/envs/autogen/bin/python /Users/mrz/Documents/Python/PYTHON/AutoGen/demo/mutiTask.py
user_proxy (to chat_manager):
draw a network topology with a central switch connected to a router and three other switch in txt
--------------------------------------------------------------------------------
Next speaker: agent1
agent1 (to chat_manager):
To draw a network topology as per your request, I'll first collect some information about the network. Here's a Python code block that will print out the required info:
```python
# filename: network_topology.py
print("Network Topology:")
print("-------------------")
print("Central Switch (SW1) connected to Router (RTR)")
print("SW1 connected to SW2, SW3 and SW4")
print("\nDevices:")
print("---------")
print("Router (RTR): connects the network to the internet")
print("Switches:")
print(" - SW1: central switch")
print(" - SW2: secondary switch")
print(" - SW3: tertiary switch")
print(" - SW4: quaternary switch")
print("\nConnections:")
print("------------")
print("SW1 -> RTR (Ethernet cable)")
print("SW1 -> SW2 (Ethernet cable)")
print("SW1 -> SW3 (Ethernet cable)")
print("SW1 -> SW4 (Ethernet cable)")
print("\nTopology Diagram:")
print("------------------")
print(" +---------------+")
print(" | |")
print(" | RTR |")
print(" | |")
print(" +---------------+")
print(" |")
print(" |")
print(" +---------------+")
print(" | |")
print(" | SW1 |")
print(" | |")
print(" +---------------+")
print(" |")
print(" |")
print(" +---------------+")
print(" | |")
print(" | SW2 |")
print(" | |")
print(" +---------------+")
print(" |")
print(" |")
print(" +---------------+")
print(" | |")
print(" | SW3 |")
print(" | |")
print(" +---------------+")
print(" |")
print(" |")
print(" +---------------+")
print(" | |")
print(" | SW4 |")
print(" | |")
print(" +---------------+")
```
Now, let's analyze the network topology. The central switch (SW1) is connected to a router (RTR), which connects the network to the internet. The central switch is also connected to three other switches: SW2, SW3, and SW4.
The devices in this network are:
The connections between devices are as follows:
This network topology can be visualized using the provided diagram. The central switch (SW1) serves as a hub for all other devices in the network.
TERMINATE
Next speaker: user_proxy
Process finished with exit code 0
But these assistants/agents didn't execute the Python script code. Why?
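A likely explanation (a sketch, not verified against your exact AutoGen version): `user_proxy` evaluates `is_termination_msg` on each incoming message before auto-replying, and code execution happens as part of that auto-reply. Since agent1's message ends with `TERMINATE`, the check fires, the chat ends, and the fenced code is never run. The lambda below is the one from the snippets above, applied to a message shaped like agent1's reply:

```python
# Termination check from the user_proxy definition above.
is_termination_msg = lambda x: x.get("content", "").rstrip().endswith("TERMINATE")

# A message shaped like agent1's reply: code, some analysis, then TERMINATE.
agent1_reply = {"content": "print('Network Topology:')\n...analysis...\nTERMINATE"}

# True -> user_proxy treats the message as terminal and never auto-replies,
# so the code-execution step inside the auto-reply never runs.
print(is_termination_msg(agent1_reply))  # -> True
```

If that is the cause, one workaround worth trying is to instruct the model to send `TERMINATE` only after the code results have come back, or to relax the termination check so the proxy gets at least one auto-reply in which to execute the code.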
**Describe the bug**

**Steps to reproduce**
no output at all

**Model Used**
Gemini-pro

**Expected Behavior**
this is from copilot

**Screenshots and logs**
No response

**Additional Information**
No response